# Section 2.4 Answers - Mathematics

1. \(y=\frac{1}{1-ce^{x}}\)

2. \(y=x^{2/7}(c-\ln |x|)^{1/7}\)

3. \(y=e^{2/x}(c-1/x)^{2}\)

4. \(y=\pm\frac{\sqrt{2x+c}}{1+x^{2}}\)

5. \(y=\pm (1-x^{2}+ce^{-x^{2}})^{-1/2}\)

6. \(y=\left[\frac{x}{3(1-x)+ce^{-x}}\right]^{1/3}\)

7. \(y=\frac{2\sqrt{2}}{\sqrt{1-4x}}\)

8. \(y=\left[1-\frac{3}{2}e^{-(x^{2}-1)/4}\right]^{-2}\)

9. \(y=\frac{1}{x(11-3x)^{1/3}}\)

10. \(y=(2e^{x}-1)^{2}\)

11. \(y=(2e^{12x}-1-12x)^{1/3}\)

12. \(y=\left[\frac{5x}{2(1+4x^{5})}\right]^{1/2}\)

13. \(y=(4e^{x/2}-x-2)^{2}\)

14. \(P=\dfrac{P_{0}e^{at}}{1+aP_{0}\int_{0}^{t}\alpha(\tau)e^{a\tau}\,d\tau}\); \(\quad\lim_{t\to\infty}P(t)=\begin{cases}\infty & \text{if } L=0,\\ 0 & \text{if } L=\infty,\\ 1/aL & \text{if } 0<L<\infty.\end{cases}\)

15. \(y=x(\ln |x|+c)\)

17. \(y=\pm x(4\ln |x|+c)^{1/4}\)

18. \(y=x\sin^{-1}(\ln |x|+c)\)

19. \(y=x\tan(\ln |x|+c)\)

20. \(y=\pm x\sqrt{cx^{2}-1}\)

21. \(y=\pm x\ln(\ln |x|+c)\)

22. \(y=-\frac{2x}{2\ln |x|+1}\)

23. \(y=x(3\ln x+27)^{1/3}\)

24. \(y=\frac{1}{x}\left(\frac{9-x^{4}}{2}\right)^{1/2}\)

25. \(y=-x\)

26. \(y=-\frac{x(4x-3)}{2x-3}\)

27. \(y=x\sqrt{4x^{6}-1}\)

28. \(\tan^{-1}\frac{y}{x}-\frac{1}{2}\ln(x^{2}+y^{2})=c\)

29. \((x+y)\ln |x|+y(1-\ln |y|)+cx=0\)

30. \((y+x)^{3}=3x^{3}(\ln |x|+c)\)

34. \(\frac{y}{x}+\frac{y^{3}}{x^{3}}=\ln |x|+c\)

40. Choose \(X_{0}\) and \(Y_{0}\) so that

\[aX_{0}+bY_{0}=\alpha,\qquad cX_{0}+dY_{0}=\beta\]

44. \(y_{1}=x^{1/3}\); \(y=x^{1/3}(\ln |x|+c)^{1/3}\)

48. \(y_{1}=\tan x\); \(y=\tan x\,\tan(\ln |\tan x|+c)\)

52. \(y=\frac{-3+\sqrt{1+60x}}{2x}\)

53. \(y=\frac{-5+\sqrt{1+48x}}{2x^{2}}\)

56. \(y=1+\frac{1}{x+1+ce^{x}}\)

57. \(y=e^{x}-\frac{1}{1+ce^{-x}}\)

58. \(y=1-\frac{1}{x(1-cx)}\)

59. \(y=x-\frac{2x}{x^{2}+c}\)
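Answers like these can be spot-checked by differentiating. For example, answer 1, \(y=1/(1-ce^{x})\), satisfies \(y'=y^{2}-y\); that equation is inferred from the answer itself, since the original problem statements are not reproduced in this key. A quick numeric check in Python:

```python
import math

# Spot-check answer 1: y = 1/(1 - c*e^x). Differentiating by hand gives
# y' = c*e^x / (1 - c*e^x)^2 = y^2 - y, so we verify y' ≈ y^2 - y with a
# central finite difference. (The ODE y' = y^2 - y is inferred from the
# answer; the problem statements themselves are not shown above.)

def y(x, c=0.5):
    return 1.0 / (1.0 - c * math.exp(x))

def dydx(x, c=0.5, h=1e-6):
    # central difference approximation of y'(x)
    return (y(x + h, c) - y(x - h, c)) / (2 * h)

for x in (-1.0, 0.0, 0.3):
    assert abs(dydx(x) - (y(x) ** 2 - y(x))) < 1e-5
```

The same finite-difference check works for any of the explicit answers above, once the matching differential equation is known.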

## Section 2.4 Answers - Mathematics

In this section we are going to submerge a vertical plate in water and we want to know the force that is exerted on the plate due to the pressure of the water. This force is often called the hydrostatic force.

There are two basic formulas that we’ll be using here. First, if we are \(d\) meters below the surface, then the hydrostatic pressure is given by,

\[P = \rho g d\]

where \(\rho\) is the density of the fluid and \(g\) is the gravitational acceleration. We are going to assume that the fluid in question is water, and since we are going to be using the metric system these quantities become,

\[\rho = 1000 \text{ kg/m}^{3} \qquad g = 9.81 \text{ m/s}^{2}\]

The second formula that we need is the following. Assume that a constant pressure \(P\) is acting on a surface with area \(A\). Then the hydrostatic force that acts on the area is,

\[F = PA\]

Note that we won’t be able to find the hydrostatic force on a vertical plate using this formula since the pressure will vary with depth and hence will not be constant as required by this formula. We will however need this for our work.

The best way to see how these problems work is to do an example or two.

The first thing to do here is set up an axis system. So, let’s redo the sketch above with the following axis system added in.

So, we are going to orient the \(x\)-axis so that positive \(x\) is downward, \(x = 0\) corresponds to the water surface, and \(x = 4\) corresponds to the depth of the tip of the triangle.

Next we break up the triangle into \(n\) horizontal strips, each of equal width \(\Delta x\), and in each interval \(\left[x_{i-1}, x_i\right]\) choose any point \(x_i^*\). In order to make the computations easier we are going to make two assumptions about these strips. First, we will ignore the fact that the ends are actually going to be slanted and assume the strips are rectangular. If \(\Delta x\) is sufficiently small this will not affect our computations much. Second, we will assume that \(\Delta x\) is small enough that the hydrostatic pressure on each strip is essentially constant.

Below is a representative strip.

The height of this strip is \(\Delta x\) and the width is \(2a\). We can use similar triangles to determine \(a\) as follows,

\[\frac{a}{4 - x_i^*} = \frac{3}{4}\quad \Rightarrow \quad a = 3 - \frac{3}{4}x_i^*\]

Now, since we are assuming the pressure on this strip is constant, the pressure is given by,

\[P_i = \rho g d = 1000\left(9.81\right)x_i^* = 9810\,x_i^*\]

and the hydrostatic force on each strip is,

\[F_i = P_i A_i = P_i\left(2a\,\Delta x\right) = 9810\,x_i^*\left(2\right)\left(3 - \frac{3}{4}x_i^*\right)\Delta x = 19620\,x_i^*\left(3 - \frac{3}{4}x_i^*\right)\Delta x\]

The approximate hydrostatic force on the plate is then the sum of the forces on all the strips, or,

\[F \approx \sum_{i=1}^{n} 19620\,x_i^*\left(3 - \frac{3}{4}x_i^*\right)\Delta x\]

Taking the limit will get the exact hydrostatic force,

\[F = \lim_{n \to \infty}\sum_{i=1}^{n} 19620\,x_i^*\left(3 - \frac{3}{4}x_i^*\right)\Delta x\]

Using the definition of the definite integral this is nothing more than,

\[F = \int_{0}^{4} 19620\,x\left(3 - \frac{3}{4}x\right)\,dx\]

The hydrostatic force is then,

\[F = \int_{0}^{4} 19620\left(3x - \frac{3}{4}x^{2}\right)dx = 19620\left(\frac{3}{2}x^{2} - \frac{1}{4}x^{3}\right)\Bigg|_{0}^{4} = 156960\,\text{N}\]
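As a sanity check, the strip construction above can be carried out numerically. This sketch approximates the force integral with a midpoint Riemann sum and recovers 156,960 N:

```python
# Midpoint Riemann sum for F = ∫₀⁴ 19620·x·(3 - 0.75x) dx, mirroring the
# horizontal-strip construction above; the exact value is 156,960 N.
n = 100_000
dx = 4.0 / n
F = sum(19620 * ((i + 0.5) * dx) * (3 - 0.75 * (i + 0.5) * dx) * dx
        for i in range(n))
assert abs(F - 156_960) < 0.01
```

This is exactly the "sum over strips" from the derivation, just with \(x_i^*\) taken to be the midpoint of each strip.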

Let’s take a look at another example.

First, we’re going to assume that the top of the circular plate is 6 meters under the water. Next, we will set up the axis system so that the origin of the axis system is at the center of the plate. Setting the axis system up in this way will greatly simplify our work.

Finally, we will again split up the plate into \(n\) horizontal strips, each of width \(\Delta y\), and we’ll choose a point \(y_i^*\) from each strip. We’ll also assume that the strips are rectangular again to help with the computations. Here is a sketch of the setup.

The depth below the water surface of each strip is,

\[d_i = 8 - y_i^*\]

since the plate has radius 2 and so its center sits 8 meters below the surface.

and that in turn gives us the pressure on the strip,

\[P_i = 9810\left(8 - y_i^*\right)\]

The area of each strip is,

\[A_i = 2\sqrt{4 - \left(y_i^*\right)^{2}}\,\Delta y\]

The hydrostatic force on each strip is,

\[F_i = P_i A_i = 9810\left(8 - y_i^*\right)\left(2\right)\sqrt{4 - \left(y_i^*\right)^{2}}\,\Delta y = 19620\left(8 - y_i^*\right)\sqrt{4 - \left(y_i^*\right)^{2}}\,\Delta y\]

The total force on the plate is,

\[F = \lim_{n \to \infty}\sum_{i=1}^{n} 19620\left(8 - y_i^*\right)\sqrt{4 - \left(y_i^*\right)^{2}}\,\Delta y = \int_{-2}^{2} 19620\left(8 - y\right)\sqrt{4 - y^{2}}\,dy\]

To do this integral we’ll need to split it up into two integrals,

\[F = 156960\int_{-2}^{2}\sqrt{4 - y^{2}}\,dy - 19620\int_{-2}^{2} y\sqrt{4 - y^{2}}\,dy\]

The first integral requires the trig substitution \(y = 2\sin\theta\), and the second integral needs the substitution \(v = 4 - y^{2}\). After using these substitutions we get,

\[F = 156960\left(2\pi\right) - 0 = 313920\pi\]

Note that after the substitution we know the second integral will be zero, because the upper and lower limits are the same. The hydrostatic force on the plate is then \(F = 313920\pi \approx 986{,}175\,\text{N}\).

## MathHelp.com

#### Solve (x + 1)(x – 3) = 0.

To solve this quadratic equation, I could multiply out the expression on the left-hand side, simplify to find the coefficients, plug those coefficient values into the Quadratic Formula, and chug away to the answer.

But why on Earth would I? I mean, for heaven's sake, this is factorable, and they've already factored it and set it equal to zero for me. While the Quadratic Formula would definitely give me the correct answer, why bother with it?

Instead, I'll just immediately solve the two factors they've given me: x + 1 = 0 gives x = –1, and x – 3 = 0 gives x = 3.

That was quick! And my answer is:

x = –1, 3

By the way, there is no strict order for the solutions. Yes, I generally put my solutions in numerical order so, in the above case, the minus answer came before the plus answer. But, unless your instructor has said something (and I'd be amazed if this were the case), the above answer would be just as correct if it had been written as "x = 3, –1".

#### Solve x² + x – 4 = 0.

The quadratic expression on the left-hand side of the "equals" sign does not factor.

(How did I very quickly know this? To be factorable, there must be integer factors of ac = (1)(–4) = –4 which sum to b = 1. I can see that there aren't any.)

This quadratic has not been provided to me in "(variable part)² equals (some number)" form, so solving by taking square roots is out.

I could solve this equation by completing the square, but that's tiresome and error-prone. I could try solving by doing a graph, but the best I'd be able to do is get a decimal approximation from my "software" (that is, my graphing calculator).

Since the instructions didn't mention anything about decimal approximations, I'll leave my answer in square-root form:

x = (–1 ± √17)/2

#### Solve x² – 3x – 4 = 0.

This equation isn't set up for me as being ready for taking square roots, and I'm never gonna use completing the square unless they specifically tell me to. Before applying the Quadratic Formula, though, I'll first quickly check to see if the expression on the left-hand side of this equation is factorable.

Are there integer factors of ac = (1)(–4) = –4 which sum to –3? Yes: –4 and +1. So this quadratic is factorable, and I've already found the numbers to use to factor it (because the leading coefficient is 1): (x – 4)(x + 1) = 0

And I'm done, just that quickly. My answer is:

x = –1, 4

#### Solve x² – 4 = 0.

The quadratic expression on the left-hand side of this equation has just two terms, and nothing factors out of both, so I won't be using simple factoring techniques. But I note that this is a difference of squares, and I know that I can factor a difference of squares: (x – 2)(x + 2) = 0, which gives x = –2, 2.

Note: I could have moved the 4 over to the right-hand side of the equation, and then taken the square root of either side of x² = 4. This method would have given me the exact same answer as the factoring done above. Unless specified otherwise, you should use whichever method you prefer.

#### Solve 6x² + 11x – 35 = 0.

The quadratic expression on the left-hand side of this equation might factor, but it looks like finding that factorization, if any, would be an unpleasant amount of work. I'm feeling a bit mindless and lazy at the moment, so I'll use the Quadratic Formula instead. As I work, I need to remember to put the ± in front of the radical, and to put the fraction line under the entire numerator, being the whole "–b ± (the square root)" part:

x = (–11 ± √(121 + 840))/12 = (–11 ± √961)/12 = (–11 ± 31)/12 = 5/3, –7/2

The solution values are fractions with no radicals, which means the quadratic could have been factored. But I've got my answer now, so I don't care about factorization any more.
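For readers who like to check by machine, here is the same computation in Python. The helper `quadratic_roots` is just an illustration, not part of the lesson:

```python
import math

# The Quadratic Formula applied to 6x² + 11x - 35 = 0, as worked above.
def quadratic_roots(a, b, c):
    disc = b * b - 4 * a * c          # discriminant b² - 4ac
    root = math.sqrt(disc)            # assumes real solutions (disc ≥ 0)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

x1, x2 = quadratic_roots(6, 11, -35)
# disc = 121 + 840 = 961 = 31², so the roots come out rational: 5/3 and -7/2
assert abs(x1 - 5 / 3) < 1e-12
assert abs(x2 + 3.5) < 1e-12
```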

#### Solve x² – 48 = 0.

This quadratic expression has two terms, and nothing factors out, so either it's a difference of squares (which I can factor) or else it can be formatted as "(variable part)² equals (a number)" so I can square-root both sides. Since 48 is not a square, I can't apply the difference-of-squares formula. Instead, I'll have to square-root both sides: x² = 48, so x = ±√48 = ±4√3.

So my answer, in exact form, is:

x = ±4√3

Note: Unless you're specifically told to provide a decimal approximation for solutions that include radicals, you should assume that they want the "exact" form of the answer; that is, they want to see those square roots.

#### Solve x² – 7x = 0.

This quadratic expression has two terms, and they factor easily: x(x – 7) = 0, so x = 0, 7.

#### Find the solutions of the quadratic represented by the table below:

Before I panic, I think about the one method of "solving" that doesn't involve an actual quadratic equation: solving by graphing.

When they want me to solve a quadratic equation by graphing, they're actually asking me to find the x-intercepts of the associated quadratic function. And, by "find", they mean "from the pretty picture". But the point is that they're wanting me to note the connection between the two, and provide them with the x-values for when y = 0.

I can do this from a picture, or I can do this from a T-chart of values. In this case, instead of a graph, they've given me a table. There are two points which have one of the coordinates equal to zero: namely, (0, 9) and (3, 0). Which of these is the one I want? The one that has y = 0, which is the second of the two points. And my solution is the corresponding x-value: x = 3.

You won't likely see many, or maybe any, of this last sort of exercise.

By the way, if you're wondering why there was only one solution to that quadratic, it's because the (intended and underlying) equation was (x – 3)² = 0. So the one solution was "repeated".

When solving quadratic equations in general, first get everything over onto one side of the "equals" sign (something that was already done in the above examples). Then check to see if there is an obvious factoring or an obvious square-rooting that you can do. If not, then it's usually best to resort to the Quadratic Formula. But don't use the Quadratic Formula for everything; while it will always give you the answer, eventually, it is not always the fastest method. And speed can count for a lot on timed tests.
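That decision procedure can be sketched in code. This is only an illustration of the advice above; the helper `solve_quadratic` and its perfect-square test are my own framing:

```python
import math
from fractions import Fraction

# Sketch of the strategy: factor (i.e. return exact rational roots) when
# the discriminant is a perfect square, otherwise fall back to the
# Quadratic Formula with a radical left in the answer. Integer a, b, c
# and real solutions are assumed.
def solve_quadratic(a, b, c):
    disc = b * b - 4 * a * c
    s = math.isqrt(disc) if disc >= 0 else None
    if s is not None and s * s == disc:
        # factorable over the rationals: exact roots as fractions
        return sorted([Fraction(-b + s, 2 * a), Fraction(-b - s, 2 * a)])
    # not factorable: decimal approximations from the formula
    return sorted([(-b + math.sqrt(disc)) / (2 * a),
                   (-b - math.sqrt(disc)) / (2 * a)])

# x² - 3x - 4 = 0 factors, so the roots are exact: -1 and 4
assert solve_quadratic(1, -3, -4) == [Fraction(-1), Fraction(4)]

# x² + x - 4 = 0 does not factor, so we get (-1 ± √17)/2 as decimals
roots = solve_quadratic(1, 1, -4)
assert abs(roots[0] - (-1 - math.sqrt(17)) / 2) < 1e-12
```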

## Math 1311 Homework 4 (Section 2.4 & Section 2.6)

Record your answers to all the problems in the EMCF titled “Homework 4”.

1. Solve using the crossing-graphs method: 12 + 3x = 9 + 3^x. Round your answer to two decimal places.
   a) 0.68  b) 0.98  c) 1.28  d) 0.33

2. Find the positive solution using the crossing-graphs method: 4x^3 − 7x = 4 − 2x^2. Round your answer to two decimal places.
   a) 1.55  b) 2.35  c) 1.95  d) 1.35

3. A breeding group of foxes is introduced into a protected area, and the population growth follows a logistic pattern. After t years the population of foxes is given by N = 39.68/(0.30 + 0.88^t) foxes. When will the fox population reach 78 individuals? Round your answer to two decimal places.
   a) after 14.26 years  b) after 16.01 years  c) after 19.85 years  d) after 12.26 years

4. The temperature C of a fresh cup of coffee t minutes after it is poured is given by C = 55e^(−0.09t) + 82 degrees Fahrenheit. The coffee is cool enough to drink when its temperature is 93.91 degrees. When will the coffee be cool enough to drink?
   a) 13 minutes after it is poured  b) 27 minutes after it is poured  c) 23 minutes after it is poured  d) 17 minutes after it is poured

5. The temperature C of a fresh cup of coffee t minutes after it is poured is given by C = 125e^(−0.05t) + 77 degrees Fahrenheit. What is the temperature of the coffee in the pot? (Note: We are assuming that the coffee pot is being kept hot and is the same temperature as the cup of coffee when it was poured.)
   a) 202 degrees Fahrenheit  b) 279 degrees Fahrenheit  c) 164 degrees Fahrenheit  d) 83 degrees Fahrenheit

6. The temperature C of a fresh cup of coffee t minutes after it is poured is given by C = 128e^(−0.03t) + 67 degrees Fahrenheit. What is the temperature in the room where you are drinking the coffee? (Hint: If the coffee is left to cool a long time, it will reach room temperature.) Round your.
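Reading the garbled logistic formula in problem 3 as N = 39.68/(0.30 + 0.88^t) — the reading that matches answer choice (d) — the crossing point can also be found algebraically rather than graphically:

```python
import math

# Problem 3 solved directly: 39.68/(0.30 + 0.88^t) = 78
#   => 0.88^t = 39.68/78 - 0.30
#   => t = ln(39.68/78 - 0.30) / ln(0.88) ≈ 12.26   (choice d)
# The formula N = 39.68/(0.30 + 0.88^t) is a reconstruction of the
# garbled original; it is consistent with the listed answer choices.
t = math.log(39.68 / 78 - 0.30) / math.log(0.88)
assert abs(t - 12.26) < 0.01
```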

## CHAPTER 4 REVIEW

This chapter introduced you to quadratics. The two major topics are the quadratic formula and graphs of quadratics. These topics have many applications in business, physics, and geometry. Factoring is an important topic in MAT 100, Intermediate Algebra.

### Section 4.2: Applications of the Quadratic Formula

Definition: ax² + bx + c = 0, with a ≠ 0, is the quadratic equation.

Example 4. A farmer wants to enclose two adjacent chicken coops against a barn. He has 125 feet of fence. What should the dimensions be if he wants the total area to be 700 square feet?

a. Complete the table to find the equation for area.

The dimensions of the chicken coops that will yield an area of 700 square feet are 35 by 20 feet and 6.667 by 105 feet.
(To get each length, divide 700 by the width: 700/35 = 20 and 700/6.667 ≈ 105.)
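A quick check of these numbers, assuming the layout uses three widths and one length of fencing (the setup consistent with the stated answers):

```python
import math

# Fence: 125 = 3w + L (three widths, one length against the barn);
# area: w·L = 700. Substituting L = 125 - 3w gives 3w² - 125w + 700 = 0.
# (The three-widths-one-length layout is an assumption consistent with
# the stated answers; the original table is not reproduced above.)
disc = 125 ** 2 - 4 * 3 * 700         # 15625 - 8400 = 7225 = 85²
w1 = (125 + math.sqrt(disc)) / 6      # w = 35   -> L = 125 - 105 = 20
w2 = (125 - math.sqrt(disc)) / 6      # w = 20/3 ≈ 6.667 -> L = 105
assert abs(w1 - 35) < 1e-9
assert abs(w2 - 20 / 3) < 1e-9
assert abs((125 - 3 * w1) - 20) < 1e-9
assert abs((125 - 3 * w2) - 105) < 1e-9
```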

### Section 4.3: Quadratic Applications and Graphs

1. The vertex:
The x coordinate is computed with the formula x = -b/(2a).
The y coordinate is computed by substituting the x coordinate into y = ax² + bx + c.
2. The x intercept:
Set y = 0 and solve 0 = ax² + bx + c using the quadratic formula.
3. The y intercept:
Substitute x = 0 into y = ax² + bx + c. Note that when x = 0, y = c.

Example 5. The cost equation for making juice boxes is C = 0.6B² - 24B + 36, and the revenue equation is R = -0.4B² + 18B. B is in millions, and C and R are in thousands of dollars.

a. Find the profit equation.

b. Graph the profit equation and explain what the vertex, B, and P intercepts mean in terms of the problem.

Find the B intercept. Set P = 0.

The B intercepts are (0.875, 0) and (41.13, 0).

Find the P intercept. Set B = 0.

The P intercept is (0, -36).

c. Suppose the company needs to earn $200,000 in profit (P = 200). Graph the line P = 200 and find how many juice boxes the company needs to make to earn $200,000.

The company needs to make 6.682 or 35.32 million juice boxes in order to earn $200,000 in profits. The vertex (21, 405) represents the maximum profit. The company will obtain its maximum profit of $405,000 when they sell 21 million juice boxes.
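These values can all be verified directly from the given cost and revenue equations, since the profit equation is P = R - C = -B² + 42B - 36:

```python
import math

# Checking the profit-equation numbers: P = R - C = -B² + 42B - 36.
a, b, c = -1, 42, -36

# Vertex: B = -b/(2a) = 21, P = 405 (maximum profit of $405,000)
Bv = -b / (2 * a)
Pv = a * Bv ** 2 + b * Bv + c
assert (Bv, Pv) == (21.0, 405.0)

# B intercepts (break-even): solve -B² + 42B - 36 = 0
disc = b * b - 4 * a * c
r1 = (-b + math.sqrt(disc)) / (2 * a)
r2 = (-b - math.sqrt(disc)) / (2 * a)
assert abs(min(r1, r2) - 0.875) < 0.01
assert abs(max(r1, r2) - 41.13) < 0.01

# P = 200: solve -B² + 42B - 36 = 200, i.e. -B² + 42B - 236 = 0
disc2 = 42 * 42 - 4 * (-1) * (-236)
s1 = (-42 + math.sqrt(disc2)) / (2 * -1)
s2 = (-42 - math.sqrt(disc2)) / (2 * -1)
assert abs(min(s1, s2) - 6.682) < 0.01
assert abs(max(s1, s2) - 35.32) < 0.01
```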

The B intercepts (0.875, 0) and (41.13, 0) tell us that the company will break even if they sell 0.875 or 41.13 million juice boxes.

The P intercept (0, -36) represents the company's start-up costs of $36,000.

## Two Column Proofs

In these lessons, we will learn how to use two-column proofs for geometric proofs.

A two-column proof consists of a list of statements and the reasons why those statements are true. The statements are in the left column and the reasons are in the right column. The statements consist of steps toward solving the problem.

The following figure gives a two-column proof for the Isosceles Triangle Theorem. Scroll down the page for more examples and solutions.

#### Two-Column Proof (5 steps) Practice 1

Practice writing a two-column proof. Example: Given AD = 8, BC = 8, B̅C̅ ≅ C̅D̅. Prove: A̅D̅ ≅ C̅D̅

#### Two-Column Proof (7 steps) Practice 2

Practice writing two-column proofs. Example: Given D̅E̅ ≅ F̅G̅. Prove: x = 4

#### Proof Practice (5 steps) Practice 3

Practice writing two-column proofs. Example: Given MN = PQ. Prove: MP = NQ

#### Practice Proof 4 (Use angle addition postulate)

Practice writing two-column proofs. Example: Given m∠RPS = m∠TPC, m∠TPV = m∠SPT. Prove: m∠RPV = 3(m∠RPS)

#### How To Use The Two-Column Proof To Prove The Isosceles Triangle Theorem?

The Isosceles Triangle Theorem states that if two sides of a triangle are congruent, then the angles opposite those sides are congruent.

#### How To Use The Two-Column Proof To Prove The Exterior Angle Theorem?

The Exterior Angle Theorem states that the sum of the remote interior angles is equal to the non-adjacent exterior angle.

#### How To Use Two-Column Proof To Show Segments Are Perpendicular?

Use the SSS, SAS, ASA, and AAS postulates, using triangle congruency to show that two intersecting segments are perpendicular. (Diagonals of a kite)

#### How To Use Two-Column Proof To Prove Parallel Lines?

Given ∠2 ≅ ∠1 ≅ ∠3. Prove: A̅B̅ || C̅D̅

#### Proving A Quadrilateral Is A Parallelogram | Geometry Proof

This video geometry lesson proves two parallelogram theorems using the two-column proof.
Proof 1: If the diagonals of a quadrilateral bisect each other, then the quadrilateral is a parallelogram.

Proof 2: If both pairs of opposite sides of a quadrilateral are congruent, then the quadrilateral is a parallelogram.

Theorems used: "If both pairs of opposite angles of a quadrilateral are congruent, then the quadrilateral is a parallelogram" and "If one pair of opposite sides of a quadrilateral is both congruent and parallel, then the quadrilateral is a parallelogram."

#### Special Parallelograms - Rhombus And Rectangle Proofs

This video uses the two-column method to prove two theorems.

Proof 1: The diagonals of a rectangle are congruent. This amounts to a triangle proof that uses CPCTC.

Proof 2: The diagonals of a rhombus are perpendicular.

## Section 2.4 Answers - Mathematics

The presidential election of 1936 pitted Alfred Landon, the Republican governor of Kansas, against the incumbent President, Franklin D. Roosevelt. The year 1936 marked the end of the Great Depression, and economic issues such as unemployment and government spending were the dominant themes of the campaign. The Literary Digest was one of the most respected magazines of the time and had a history of accurately predicting the winners of presidential elections that dated back to 1916. For the 1936 election, the Literary Digest prediction was that Landon would get 57% of the vote against Roosevelt's 43% (these are the statistics that the poll measured). The actual results of the election were 62% for Roosevelt against 38% for Landon (these were the parameters the poll was trying to measure).
The sampling error in the Literary Digest poll was a whopping 19%, the largest ever in a major public opinion poll. Practically all of the sampling error was the result of sample bias. The irony of the situation was that the Literary Digest poll was also one of the largest and most expensive polls ever conducted, with a sample size of around 2.4 million people! At the same time the Literary Digest was making its fateful mistake, George Gallup was able to predict a victory for Roosevelt using a much smaller sample of about 50,000 people. This illustrates the fact that bad sampling methods cannot be cured by increasing the size of the sample, which in fact just compounds the mistakes. The critical issue in sampling is not sample size but how best to reduce sample bias.

There are many different ways that bias can creep into the sample selection process. Two of the most common occurred in the case of the Literary Digest poll. The Literary Digest's method for choosing its sample was as follows: based on every telephone directory in the United States, lists of magazine subscribers, rosters of clubs and associations, and other sources, a mailing list of about 10 million names was created. Every name on this list was mailed a mock ballot and asked to return the marked ballot to the magazine.

One cannot help but be impressed by the sheer ambition of such a project. Nor is it surprising that the magazine's optimism and confidence were in direct proportion to the magnitude of its effort. In its August 22, 1936 issue, the Literary Digest announced:

Once again, [we are] asking more than ten million voters -- one out of four, representing every county in the United States -- to settle November's election in October. Next week, the first answers from these ten million will begin the incoming tide of marked ballots, to be triple-checked, verified, five-times cross-classified and totaled.
When the last figure has been totted and checked, if past experience is a criterion, the country will know to within a fraction of 1 percent the actual popular vote of forty million [voters].

There were two basic causes of the Literary Digest's downfall: selection bias and nonresponse bias.

The first major problem with the poll was in the selection process for the names on the mailing list, which were taken from telephone directories, club membership lists, lists of magazine subscribers, etc. Such a list is guaranteed to be slanted toward middle- and upper-class voters, and by default to exclude lower-income voters. One must remember that in 1936, telephones were much more of a luxury than they are today. Furthermore, at a time when there were still 9 million people unemployed, the names of a significant segment of the population would not show up on lists of club memberships and magazine subscribers. At least with regard to economic status, the Literary Digest mailing list was far from being a representative cross-section of the population. This is always a critical problem because voters are generally known to vote their pocketbooks, and it was magnified in the 1936 election when economic issues were preeminent in the minds of the voters. This sort of sample bias is called selection bias.

The second problem with the Literary Digest poll was that out of the 10 million people whose names were on the original mailing list, only about 2.4 million responded to the survey. Thus, the size of the sample was about one-fourth of what was originally intended. People who respond to surveys are different from people who don't, not only in the obvious way (their attitude toward surveys) but also in more subtle and significant ways. When the response rate is low (as it was in this case, 0.24), a survey is said to suffer from nonresponse bias. This is a special type of selection bias where reluctant and nonresponsive people are excluded from the sample.
Dealing with nonresponse bias presents its own set of difficulties. We can't force people to participate in a survey, and paying them is hardly ever a solution, since it can introduce other forms of bias. There are ways, however, of minimizing nonresponse bias. For example, the Literary Digest survey was conducted by mail. This approach is the most likely to magnify nonresponse bias, because people often consider a mailed questionnaire just another form of junk mail. Of course, considering the size of the mailing list, the Literary Digest really had no other choice. Here again is an illustration of how a big sample size can be more of a liability than an asset.

Nowadays, almost all legitimate public opinion polls are conducted either by telephone or by personal interviews. Telephone polling is subject to slightly more nonresponse bias than personal interviews, but it is considerably cheaper. Even today, however, a significant segment of the population has no telephone in their homes (in fact, a significant segment of the population has no homes), so selection bias can still be a problem in telephone surveys.

The most extreme form of nonresponse bias occurs when the sample consists only of those individuals who step forward and actually "volunteer" to be in the sample. A blatant example of this is the 900-number telephone polls, in which an individual not only has to step forward, but he or she actually has to pay to do so. It goes without saying that people who are willing to pay to express their opinions are hardly representative of the general public, and information collected from such polls should be considered suspect at best.

## California Considers ‘Equitable Math’ Because Goal of Getting Correct Answer is ‘Racist’

In today’s progressive society, even answering a math problem correctly is racist.
The California education department is weighing a statewide math framework that perpetuates the idea that working to solve a problem in math is an example of racism and white supremacy in schools. The framework, titled “A Pathway to Equitable Math Instruction: Dismantling Racism in Mathematics Instruction,” would provide “exercises for educators to reflect on their own biases to transform their instructional practice.” The website states its training manual was funded by the Bill and Melinda Gates Foundation, Breitbart reports.

“White supremacy culture infiltrates math classrooms in everyday teacher actions,” the document states. “Coupled with the beliefs that underlie these actions, they perpetuate educational harm on Black, Latinx, and multilingual students, denying them full access to the world of mathematics.”

The framework provides examples of how “white supremacy” takes over math classes.

• The focus is on getting the “right answer”
• Independent practice is valued over teamwork or collaboration
• “Real-world math” is valued over math in the real world
• Students are tracked (into courses/pathways and within the classroom)
• Participation structures reinforce dominant ways of being

The document identifies the ways teachers perpetuate white supremacy in their assessments.

• Students are required to “show their work”
• Grading practices are focused on lack of knowledge
• Language acquisition is equated with mathematical proficiency

“These common practices that perpetuate white supremacy culture create and sustain institutional and systemic barriers to equity for Black, Latinx, and Multilingual students. In order to dismantle these barriers, we must identify what it means to be an antiracist math educator,” the framework continues.
“In order to embody antiracist math education, teachers must engage in critical praxis that interrogates the way in which they perpetuate white supremacy culture in their own classrooms, and develop a plan toward antiracist math education to address issues of equity,” it adds.

If you thought critical race theory is the worst it could get, you clearly thought wrong.

## Klinikka Helena

Klinikka Helena is proud to offer you the best of Finnish healthcare. We treat breast cancer with the best modern methods, thanks to which 91-92% of patients survive breast cancer beyond 5 years, and 86% beyond 10 years. We specialize in breast shaping (including augmentation, reduction, and symmetry corrections) and especially in breast reconstruction after breast cancer. We treat Finnish and international patients with warmth and professionalism.

## Section 2.4 Answers - Mathematics

Is it really the case that the non-linguistically inclined student who progresses through math with correct but unexplained answers (from multi-digit arithmetic through to multi-variable calculus) doesn’t understand the underlying math? Or that the mathematician with the Asperger’s personality, doing things headily but not orally, is advancing the frontiers of his field in a zombie-like stupor?

I wouldn’t bet that the student with correct but unexplained answers understands nothing, but I wouldn’t make any confident bets on exactly what that student understands either. Math answers aren’t math understanding any more than the destination of your car trip indicates the route you took. When five people arrive at the same destination, asking how each arrived tells you vastly more about the city, its traffic patterns, and the drivers than just knowing they arrived.

Their other exemplar of understanding-without-explaining is strange also. Mathematicians advance the frontiers of their field exactly by explaining their answers: in colloquia, in proofs, in journals.
Those proofs are some of the most rigorous and exacting explanations you’ll find in any field. Those explanations aren’t formulaic, though. Mathematicians don’t restrict their explanations to fragile boxes, columns, and rubrics.

Beals and Garelick have a valid point that teachers and schools often constrain the function (understanding) to form (boxes, columns, and rubrics). When students are forced to contort explanations to simple problems into complicated graphic organizers, like the one below from their article, we’ve lost our way. Understanding is the goal. The answer, and even the algebraic work, only approximate that goal. (Does the student know what that number means in the problem, for example? I have no idea.) Let’s be inflexible in the goal but flexible about the many developmentally appropriate ways students can meet it.

Featured Comments

Really too many to call out individually, but I’ll try.

Yet another important thing about students explaining their reasoning is that there is great self-help in a careful explanation of processes. How often have we had a student explain a problem he/she did incorrectly and, in the explanation, the student realizes the mistake without a word from us? This “out-loud-silently-in-my-head” thinking is such an important thing to help students develop.

David Wees articulates a similar point:

Another reason that we might want to listen to or read a student explanation of how they solved a problem is just so, in the process of articulating their solution, students may run into their own inconsistencies in their work. I have noticed, quite often, that students will give an answer that I don’t understand, and then when I ask them to explain what they did, in the middle of their explanation they say something like, “Oh, oops! Yeah, that isn’t right. I mean this instead” and revise their thinking.

Mathematicians use words in describing their discoveries all the time, and have for a long time.
That’s why some doctorates in mathematics require a foreign language, so that the candidate can read mathematicians’ writings in the original language.

1) Can the traditionalists and progressives find a lesson/activity/short video that they both agree is lovely teaching?
2) Same thing, but they both agree it’s lousy teaching.
3) Can each identify a whatever that they like, but their sparring partner doesn’t? Can they explain why?
4) If the disagreement persists, can they explain why they think it does?

Elizabeth Statmore uses an explanation protocol called Talking Points and brings student voices into the conversation.

Tracy Zager excerpts quotes from mathematicians on the value of explanation in their own work.

There is also an exchange between Brett Gilland and Ze’ev Wurman that lays bare two views on teaching that are completely distinct. It’s devastating.

You’ll find another great exchange between Brett Gilland and Katharine Beals (search their names throughout the comments) which ends rather unconventionally for Internet-based discussions of math education.

2015 Nov 21: Katharine Beals has responded on her blog to the objections raised by commenters here. Parts #1, #2, #3.

2015 Nov 28: Education Realist has posted a response that dives into the difference between elementary math ed (the site of Garelick & Beals’s research) and middle school math ed (the site of Garelick & Beals’s arguments).

### 92 Comments

#### Bob Donaldson

Great points. Understanding is the goal, and simply swapping one rigid formulaic representation for another does not ensure understanding. Creative flexibility as we move toward the goal seems like good advice.

#### Denise Gaskins

Setting aside the autistic genius category, most students come to understand their own thinking better as they struggle to put it into words. We adults need to recognize that putting ideas into words is often a struggle.
We need to give kids time, to allow for intuitive explanations, to recognize the kernel of truth in their flailing attempts and help them draw it out. Math explanations, IMO, should be oral and informal, at least in the elementary and middle school years. Written explanations are much harder than oral ones, just as essay writing is much more difficult than relating a story to your friends. And most of all, written explanations should NEVER be required on elementary-level high-stakes standardized tests.

#### Brett Gilland

What Bob said. Also, I read the entire article thinking that the answer here is professionals who use their professional discretion to engage student minds and do their best to inductively determine what is happening inside those minds. A refrain that my students hear frequently is that I need to find out what is happening in your mind and I am a pretty terrible mind reader… unless they give me clues. I am pretty wide open as to what constitutes a clue, but it is definitely more than just a bald answer.

Finally, if what we are actually complaining about here is the SBAC and PARCC tests, I am more open to what she is saying. I trust the written response section of AP mathematics exams because they are graded by professionals with demonstrated expertise using clear rubrics, with problem cases brought before the body of graders and discussed. This is expensive, and the testing companies show no actual interest in such an involved process run by and through educational professionals. Failing that, just give them a computer-gradable test and at least let the mandatory BS be less painful for everyone involved. The gain in estimation of understanding doesn’t justify the increase in stress and frustration caused by a dodgy free-response input and grading system.
#### Lisa Soltani

I do ask students to explain their mathematical reasoning because it helps me assess what they actually know, and because clearly explaining their reasoning in writing is a skill worthy of development, and one that students should be developing across disciplines. That said, I also make the point that writing out “I multiplied 3 times 8 to get 24 feet in 8 yards because there are 3 feet in a yard” is in no way superior to writing “3 feet per yard × 8 yards = 24 feet.” Some students find thinking symbolically simpler; others don’t. I am relatively indifferent. However, I do think clarity matters, and explanations can tell you a great deal about how much a student really understands.

Consider these responses to the following question from a preassessment I recently gave my sixth grade students:

When dividing a whole number by a fraction, would you expect the quotient to be greater than or less than the whole number? Explain your answer. You may use any pictures, numbers and/or words to explain your thinking.

Responses from my students (many of whom could accurately compute a quotient on the same assessment) included:

“greater, because dividing a whole # by a frac it is the equivalent of whole x whole”

“what’s quotient? can’t figure out”

“less, a fraction means ‘a part’ also multiplication can mean ‘of’ so when you multiply a whole by a fraction it is part of a whole number”

“less because if you want to separate something you need to take away!”

“less than, I think it would be less than because if you are dividing a whole number by a fraction you would half [sic] to make the whole number a fraction too”

“greater because you are reversing the order of doing it with the whole numbers”

“lower because when you mult a number by one it stays the same number because it is”

Incidentally, not one student chose to draw a picture.
Given the cloudy thinking and clumsy communication, my students definitely have some work to do around understanding fraction division, even if they already know how to find the answer.

#### Blaise

Dan, I agree with you that “understanding is the goal”. I don’t think that the article implied that a correct answer is the goal or is sufficient in showing understanding. The authors believe that the algebraic solution for the “coat sale” question was sufficient. I feel that there is value in struggling to return to the context of the question and be able to explain your reasoning. The Common Core demands this as well. The authors suggest that this is a waste of time; I strongly disagree. The Common Core does a good job of explaining the need to decontextualize AND contextualize in problem-solving. (http://www.corestandards.org/Math/Practice/)

I really like the analogy provided a couple of years ago by Chris Hunter in his blog post comparing play-by-play broadcasters to colour commentators in sports (https://reflectionsinthewhy.wordpress.com/2013/10/26/less-play-by-play-more-colour-commentary/). Ideally, students can provide insightful, descriptive feedback about how a problem is solved. In my opinion, this is a worthwhile pursuit in our quest for mathematical understanding.

#### Greg Ashman

When did we decide that maths needs to be explained in words? I am quite insistent on my students providing explanations. I call them ‘workings’, and they are the series of mathematical steps that they have followed to arrive at their answer. This is how things are explained in mathematics. However, for some reason this does not show understanding. In order to understand mathematics, we need to be able to waffle on about it in English. And yet mathematics was invented in order to make it easier to express notions that are cumbersome to express with words. That’s part of the beauty of it.
It is as if we were to insist that the only way to understand German is to translate it into English, and that reasoning in German alone does not show an understanding of German. Clearly, translation is a useful device for novice learners and a key component of teaching, but being able to work entirely within the target language is a sign of sophistication rather than of a lack of understanding.

Every year, I teach my senior physicists about wave-particle duality. Light, I suggest, can be thought of as a wave or as a particle, depending on the situation. “But what,” they ask, “actually is it?” “Ah,” I say, “If you really want to understand what light is, you need to understand it through the maths. It doesn’t translate well into English.”

#### Lisa Willey

Absolutely! Students need to be able to verbalize their work. Many students can crunch numbers and arrive at the correct answer. However, most do not know what the number (answer) represents. There’s really no true application of math when they don’t know WHY or WHAT they were working towards. We need to consistently have students answer math questions in a complete sentence (responding to the original question or direction given).

#### Sadler

I’m with Greg here. I find, especially in younger students, that some mathematical concepts are understood with absolutely no hope of them explaining it verbally (until older). The language part is necessary for us teachers who want to see the maths translated, as one demonstration of understanding, and probably a pretty poor one. And at the other end of the spectrum, where language itself is incapable of expressing concepts (Gödel?), you’d be asking an impossible task.

Lisa, if by “crunching numbers” you mean following an algorithm, then I’d partially agree with you, but following an algorithm isn’t necessarily blindly following. If students can competently get an answer with zero understanding then the teacher has started from the wrong place or failed to construct any understanding.
#### David Wees

Regarding Greg’s point, I’m quite comfortable with an explanation of a solution that draws heavily on mathematical notation. My reason for wanting to read explanations that are more language-heavy applies when students do not know the mathematical language they need in order to be able to describe an idea clearly. The “student-friendly” language, for lack of a better word, acts as a bridge between the dense and nearly incomprehensible language of the mathematician (to an outsider) and the everyday language that students know. So if students are writing needlessly complex language when they know how to communicate more succinctly through mathematical language, AND those more succinct explanations will be clear to everyone who may potentially be reading them, then we should allow students to write using all of the language that they know.

There is another reason that we might want students to write out explanations. Dan, you remember that study about Benny and IPI mathematics? Greg, I don’t know if you have read the article or not, but it is a horrifying account of a student who has essentially invented all of his own mathematical rules for solving problems. He can get answers with a high degree of accuracy (not 100%, of course), but the rules he has invented are not remotely close to the justifications a mathematician would use. If Benny had been asked to write out an explanation of why something worked, even occasionally, he would likely have exposed his very different understanding of the mathematical ideas sooner.

I propose that there are actually quite a large number of Bennys out there: people who have relatively accurate but not entirely consistent ways of solving mathematical problems, where the markings they put on paper resemble the expected markings if they really understood the mathematics, but, for a variety of reasons, they have invented their own rules.
For Benny, he was part of an individualized programmed instruction course where students were passed on to the next level if they passed the quiz with at least an 80%. In theory, students were supposed to get individual tutoring if their mark was too low on any individual quiz, but in practice the teacher and the tutor assigned to the class had to spend nearly all of their time marking the quizzes, and so students rarely actually got individual time with a teacher, leading to Benny inventing ways to game the quizzes and move on to the next level.

Another reason that we might want to listen to or read a student explanation of how they solved a problem is just so, in the process of articulating their solution, students may run into their own inconsistencies in their work. I have noticed, quite often, that students will give an answer that I don’t understand, and then when I ask them to explain what they did, in the middle of their explanation they say something like, “Oh, oops! Yeah that isn’t right. I mean this instead” and revise their thinking.

As for ideas being cumbersome with words, I think it is telling that you draw on wave-particle duality. You’ll note that is a concept that students are unlikely to encounter unless they take an advanced high school physics course. Are there actually ANY math concepts from K to 12 that cannot be explained in words? That seems like a bit of a red herring argument. I agree that mathematical notation is powerful and that I want students to be able to use its power, but I think that, at least once students have some fluency with writing, one way to do that is to use the language they know and increasingly substitute the mathematical language as they understand it.

As for “this is how things are explained in mathematics”, see Paul Lockhart’s Measurement for an example of how to explain interesting geometric concepts using pictures and (mostly) everyday language.
#### Dan Meyer

“I am quite insistent on my students providing explanations. I call them ‘workings’, and they are the series of mathematical steps that they have followed to arrive at their answer. This is how things are explained in mathematics.”

Except they aren’t. Head to any paper at arxiv, even in subjects that lend themselves best to algebraic notation, like algebraic topology. Their explanations rely on non-algebraic notation like words, sentences, etc.

For loads of problems, I’m with you. Ask for an explanation in the most appropriate language for the student and the math. But the languages of arithmetic and algebra aren’t nearly as complete as German and English. Arithmetic and algebra are efficient in all kinds of ways. But whenever they leave me a question about what a student knows, I need an explanation in a different language.

#### Susan

Yet another important thing about students explaining their reasoning is that there is great self-help in a careful explanation of processes. How often have we had a student explain a problem he/she did incorrectly and, in the explanation, the student realizes the mistake without a word from us? This “out-loud-silently-in-my-head” thinking is such an important thing to help students develop.

#### Brian

This post resonates with something I’ve implemented recently in my class. I’ve started providing the final answers to every question on exams. I want students to prove to me why the answer is what it is. I want them to be consumed with the question and how to arrive at the answer, not the answer itself. So far, so good. I’ve seen more proof (i.e. work) this year than I ever have.

#### Denise Gaskins

One of the most exciting changes I’ve seen in my decades of following math education has been the spread of number talks or counting circles. These offer an excellent opportunity for the sort of natural, informal discussions that help kids build understanding.
And they give teachers a chance to model the sort of written explanation that students can use to describe their thinking, as the teacher writes notes about the different ways kids describe their answers.

#### Jkern

I feel like a lot of the clarity desired in showing your work comes from using units in the math to promote and communicate understanding of relationships. When I have high school chemistry students unsure about whether to take density divided by mass, or times mass, or mass divided by density to calculate volume, there is a fundamental problem with their understanding of the relationship between the types of measurements and their units. From grade school upward, instead of multiplying 5 people × 3 crayons/person = 15 crayons, we just accept 5 × 3 = 15 as adequate to show their work. Do they know what they are multiplying, or did they just deduce that this problem gave them 2 numbers, and it’s a multiplication worksheet, so they should multiply? Most teachers at least require units on the answers, but writing units within the working math that is put down on the paper helps show students how the units guide their work. When given a novel situation involving a relationship ratio, their understanding of unit relationships, not of rote mathematical operations, will get them to the correct answer. Yet, in my experience (which may be the exception, or maybe kids just forget things), this skill isn’t taught until they are in high school science classes and can’t perform elementary math operations.

#### David Coffey

Mathematicians use words in describing their discoveries all the time — and have for a long time. That’s why some doctorates in mathematics require a foreign language so that the candidate can read the mathematicians’ writings in the original language.

I ask students to share their thinking because it requires them to examine and clarify their thinking. It’s also helpful as an assessment and fascinating to read.
Their Metacognitive Memoirs are much more informative and interesting than 30 correctly answered exercises: http://deltascape.blogspot.com/2013/05/what-are-your-thoughts.html

#### Jason Dyer

Just as a thought experiment, what is a.) a problem that could be justified via only algebra notation just fine? b.) a problem where in all cases algebra steps wouldn’t be enough and would _require_ words to have a correct justification?

It is my belief that students benefit greatly when they are expected to explain or justify their reasoning, and just as important, when they listen to each other. Placing a high value on reasoning levels the playing field and focuses learning on something other than right or wrong. Students who can sensibly reason their way through a task such as the coat problem may struggle with formal notation, and students who readily use formal notation may struggle to understand it or communicate their reasoning; both students will benefit from listening to each other. It’s a win-win situation, where everybody gets to be valued as a competent contributor.

#### Greg Ashman

I think that this is where we differ. You look at expert performance — in this case, the writing of academic mathematics papers — and you infer something about the teaching of high school maths. On the other hand, I am sceptical that novices attain expertise by simply emulating the behaviour of experts. Paul Kirschner describes this as confusing epistemology — how the field of knowledge advances — with pedagogy — how we teach novices what is already well established. Essentially, I see this as fallacious and one of the ideas that drives constructivist educators towards less effective teaching methods. It is quite clear that experts and novices differ in many significant ways. “How People Learn” — of which I am not a fan — actually contains good descriptions of some of the key experiments that demonstrate these differences.
I would personally recommend Chapter 6 of Dan Willingham’s “Why Don’t Students Like School?”, which addresses the expert/novice confusion issue directly.

In academic maths papers, an author is often setting out to explore new and sometimes contested domains. They will need to explain their reasoning and spend time outlining definitions. Sometimes, new notation and formalisms will be required. On the other hand, I can think of no standard high school maths problem to which the solution cannot be fully ‘explained’ using well-known, often centuries-old formal steps. A teacher will recognise these steps and be able to deduce whether the reasoning is correct. Different ways of representing such a solution only become necessary if you value students deriving their own suboptimal methods, i.e. it is a logical consequence of a discovery learning approach. And discovery learning is ineffective.

#### Dan Meyer

Jason Dyer, useful thought experiment. I added it to a new post.

“You look at expert performance — in this case, the writing of academic mathematics papers — and you infer something about the teaching of high school maths.”

This is a relevant critique in some other discussion. I’m only looking at expert performance in this one because you and Barry have both made claims about it that are unsubstantiated.

“On the other hand, I can think of no standard high school maths problem to which the solution cannot be fully ‘explained’ using well-known, often centuries-old formal steps.”

Too many qualifiers for me to respond. Your definition of “standard” is known only to you. This is a stronger claim than I’ve ever seen you make, FWIW. Clearly, you don’t think problems are a useful vehicle for learning new knowledge. I didn’t realize you saw them as useless — period — even in the application of existing knowledge. If I understand you correctly, there is simply no place for a task like — picking one at random — the pool border problem — at any phase in instruction, early or late.
#### Greg Ashman

Dan, please state the unsubstantiated claims that I have made about expert performance. Thanks.

I have not claimed that problems are useless. I have simply claimed that the solutions can be written using well-known, formal steps. For “standard”, think of typical textbook questions.

#### Ze'ev Wurman

1) Dan argues that “Math answers aren’t math understanding any more than the destination of your car trip indicates the route you took. When five people arrive at the same destination, asking how each arrived tells you vastly more about the city, its traffic patterns, and the drivers, than just knowing they arrived.”

I will argue that if you arrive at your destination time after time without ever being lost, it may not tell people much about the city or its traffic patterns, but it does clearly tell them that you know your way around the city. I thought the purpose of a problem in a classroom is to check whether a student knows sufficient math to solve it, rather than to learn about the nature of human thinking processes. If it is the latter, Dan is completely right, except it belongs in a cognitive science experiment rather than a classroom.

2) Dan comments that “Head to any paper at arxiv, even in subjects that lend themselves best to algebraic notation, like algebraic topology. Their explanations rely on non-algebraic notation like words, sentences, etc.”

True for *most* papers. The reason is twofold: (a) using words and sentences serves as scaffolding for most readers who cannot actually handle the math, and (b) many articles attempt to draw conclusions from the math that apply to non-mathematical domains (e.g., policy or future-directions recommendations), for which technical language is not necessarily helpful. But I hope nobody kids himself that voluminous verbiage serves the student who can already do the math. At best, it imposes verbiage on the students to make the teacher happy.
At worst, it serves to obfuscate the lack of math knowledge by spouting a lot of unnecessary and meaningless words and hoping to get some partial credit.

#### Brett Gilland

There is an interesting philosophy concept called ‘Philosophical Zombies’ which seems to be relevant here: https://en.wikipedia.org/wiki/Philosophical_zombie

I think far too many of us have encountered ‘mathematical zombies’, students who can reproduce all the steps of a problem while failing to evidence any understanding of why or how their procedures work, to have a ton of faith in the claim that correct procedure is necessarily equivalent to correct understanding. David Wees gives an excellent example of one such instance above. Asking for full explanations from students, in addition to accurate calculations, has the double advantage of catching many conceptual misunderstandings early, so that they can be fully addressed, and encouraging students to consider math as a meaning-full subject as opposed to a long list of procedures to apply in various contexts. Constructivist teaching techniques may be less efficient in producing procedural fluency on a given schedule, but they are often more effective at producing students who have developed a solid conceptual grounding in mathematics.

#### Dan Meyer

“Please state the unsubstantiated claims that I have made about expert performance.”

I’m referring to this claim from your first comment: “This is how things are explained in mathematics.” Maybe you meant to restrict “mathematics” to “standard textbook mathematics problems.” I took your comment at face value.

“I have simply claimed that the solutions can be written using well-known, formal steps.”

Symbolic steps? Verbal steps? “Formal” adds a new dimension to a conversation that’s already suffering for precision.

“But I hope nobody kids himself that voluminous verbiage serves the student who can already do the math.”
You’re the only person here who knows the parameters on “voluminous.” As I wrote in my follow-up, Garelick’s claim is similarly underspecified.

Is anybody here claiming that symbolic (which is to say, arithmetic or algebraic) explanations are always sufficient as a gauge for student understanding? That’s been my impression so far, and I’d be happy to be corrected. If verbal expressions are sometimes useful, I’m interested in when they are useful. Do any of my traditionalist colleagues have an opinion there? Barry & Ze’ev are both pointing at a stupid graphic organizer saying, “That isn’t useful.” And I don’t disagree with either of them. It just isn’t a very helpful point.

#### Brett Gilland

“I thought the purpose of a problem in a classroom is to check whether a student knows sufficient math to solve it, rather than to learn about the nature of human thinking processes.”

I can not disagree with this enough. The purpose of a problem in my classroom is almost always to understand the nature of that human’s thinking processes. This allows for amplification, further investigation into how the student is able to navigate similar problems with subtle variations and complications, and attempts to draw student mental models into internal conflict to create pressure for remediation and revision of said mental models. I suspect that I am not alone in this view, which is why you see many of us arguing that explaining one’s thinking is important while simultaneously arguing that strict adherence to formal structures often hinders this process instead of helping.

#### Greg Ashman

My comments were not specifically about expert performance; they were about maths in general. You seem to interpret them in terms of expert performance, which I find interesting given the expert/novice fallacy. Even the academic papers to which you refer make use of formal, symbolic logic. It would be strange if they did not.
I doubt that many people would be impressed with a formal mathematical proof expressed in prose. I don’t think *anything* is always sufficient to gauge student understanding. That’s the point. This is a fool’s errand. Student understanding is a *latent* property that we can only attempt to infer by proxy. Just as we could perhaps mindlessly train students to reproduce symbolic procedures, we could mindlessly train them to draw silly diagrams and write prose. In this case, neither would demonstrate understanding, but at least the former would not be a mathematical dead-end.

#### Michael Pershan

These conversations always bewilder me. They take place waaaay up in the stratosphere of abstraction. They remind me most of dorm room debates about morality and religion. That is, very, very abstract and unconstrained by the rigors of any discipline that might provide structure, rigor or checks on the arguments. Here’s what I wish for, whenever one of these arguments busts out between traditionalists and progressives.*

1) Can the traditionalists and progressives find a lesson/activity/short video that they both agree is lovely teaching?
2) Same thing, but they both agree it’s lousy teaching.
3) Can each identify a whatever that they like, but their sparring partner doesn’t? Can they explain why?
4) If the disagreement persists, can they explain why they think it does?

* I refuse to use the term educationist. It sounds stupid and doesn’t describe anything.

#### Elizabeth (@cheesemonkeysf)

What’s interesting to me is what actual students say about the value – or lack of value – in speaking mathematics in class. In addition to providing multiple measures I can use on the fly to assess the general state of understanding and/or learning, talking about mathematics has benefits for a wide range of students.
Three recent quotes from my students may be illustrative here of how some students experience the power of verbal communication about mathematics:

“…I really benefited from the amount of communication we focused on in our class. Talking Points helped me understand what we were learning about and about my classmates’ brain spaces. Understanding my classmates more allowed me to understand math more just because I understood how their brains worked through problems.”

“Before this class I never communicated in math the way we did: by sharing our ideas, and connections we made without the judgments I usually associate with math classes. I have never been the strongest math student and as a result, while I do talk in class it is not without fear that there are eye rolls throughout the class…. In your class I saw the value of community between mathematicians and I hadn’t ever seen that before. It made me see mathematicians together deep in discussion rather than shut in a room suffering by themselves. And yes that is a bad image but your class changed that.”

“Initially, I was a little daunted with being tasked to communicate verbally about math. Math makes sense on paper, that’s just the way we were taught. We didn’t speak math. In Geometry, we had to explain proofs but that was the extent of my knowledge on mathematical communication. However, I did not do as badly as expected. I got used to talking about math instead of just solving it. I remember Talking Points. Those were the main reason that eased my transition from writing math to talking about math.”

As Henri Picciotto always says, no one approach or tactic works for all learners all of the time. So why not expand our own repertoires in classroom practice?

#### Michael Paul Goldenberg

Katharine Beals, according to the description on her website, has three “left-brain” children (her terminology). She’s written a book on RAISING A LEFT-BRAIN CHILD IN A RIGHT-BRAIN WORLD.
I think a perusal of that website or a look at her book would be useful in ascertaining where her views about mathematics education are coming from.

Many people argue, incorrectly in my view, that students who can compute correctly (or at somewhat higher grade levels, solve algebra and other secondary math problems correctly) do not need to explain anything about their thought process. Further, there is a point of view that it is unfairly punitive to ask or expect these students to put anything in writing about how they arrived at their answers. But for my money, at the K-5 level especially (but not exclusively), some degree of explanation beyond writing math symbols is necessary for the vast majority of students to gain mastery. Focusing on only a special group and then expecting that everyone else MUST follow suit is unreasonable. Basing a national approach to mathematics education on such foundations is almost certainly continuing practices that have not served a majority of our students at all well.

I don’t disagree that some teachers, particularly those who don’t understand the point of getting students to explain, or whose own grasp of mathematics may be such that they don’t feel comfortable explaining their own process, may undermine the explanatory aspects of math pedagogy. And others, well-meaning but also ultimately harmful, may for various reasons wind up being overly rigid in how they address the issue of explanations with some or all of their students. In this regard, it’s important for teachers to recognize that there are MANY ways to explain mathematics other than purely symbolically or mostly in words. And for students at various developmental levels, demanding/expecting written or oral explanations that are predominantly or exclusively verbal can be inappropriate or unreasonably frustrating for a given student.
Just like using manipulatives, assigning “real-world” problems, involving technology in lessons and assignments, and much else that has been debated in mathematics education over the last 25-30 years, this issue has been unnecessarily obscured by the ways in which some people utterly reject the idea of mandatory explanations out of hand, while others apply the idea insensitively and inflexibly. No amount of reasonable discourse will satisfy everyone. It is, however, necessary to try to offer sound arguments and explanations for this practice. At the same time, teachers and those who train, employ, supervise, coach, or otherwise work with them professionally cannot wait until there is universal agreement on a practice before educating instructors about it and helping them effectively implement it with their students.

#### Ze'ev Wurman

Brett Gilland, in response to my point that problems are to determine student knowledge rather than study the nature of human thinking processes, writes:

“I can not disagree with this enough. The purpose of a problem in my classroom is almost always to understand the nature of that human’s thinking processes. This allows for amplification, further investigation into how the student is able to navigate similar problems with subtle variations and complications, and attempts to draw student mental models into internal conflict to create pressure for remediation and revision of said mental models.”

Well, I disagree. I suspect that Gilland’s employer, and certainly the parents of his students, would also disagree. Some quite strongly. The primary purpose of school is to educate the kids at hand, not to train the teacher. This doesn’t mean that teachers do not learn from experience, but if gaining experience and insight is the primary reason for what the teacher does, he’d better get approval from an IRB and a waiver from each individual student or parent who attends his class.
There is nothing wrong with having a student occasionally explain his thinking when s/he is called on in class. There is a lot wrong with having students explain their thinking in a voluminous verbal/written manner on every test item and every home assignment. Which brings me to Dan, who argues that my definition of “voluminous” is under-specified. So here is a specification for you: if an average competent math teacher — not a generalist K-8 teacher, but a teacher with math certification of some kind — can easily follow what the student *mathematically* did, then it requires absolutely no additional verbiage. If a student uses verbiage to indicate one or more steps where s/he cannot or does not use mathematical symbolism to indicate such steps, that verbiage is not superfluous. In other words, any place that has BOTH proper mathematical symbolism and narrative, one — typically the latter — tends to be “voluminous” (and ill-advised) verbiage. The patent office uses this type of definition, as do regular courts. And to answer Dan’s last question, whether symbolic explanations are always sufficient as a gauge of student understanding, my answer for K-12 math is overwhelmingly yes. It may be that a student is as yet unfamiliar with the particular symbolism helpful for the problem at hand, and then a few words can replace the missing symbolism, particularly in early grades, but I consider insisting on verbalism (smile) in a math class, when symbolism is accessible, or when it duplicates the already-written symbolism, to verge on malpractice.

#### Tracy Zager

I know the “traditionalists” discount the idea of turning to mathematicians to see how they communicate, but I don’t. I find it quite instructive. A few favorites.
Ian Stewart: “A proof, they tell us, is a finite sequence of logical deductions that begins with either axioms or previously proved results and leads to a conclusion, known as a theorem….This definition of ‘proof’ is all very well, but it is rather like defining a symphony as ‘a sequence of notes of varying pitch and duration, beginning with the first note and ending with the last.’ Something is missing. Moreover, hardly anybody ever writes a proof the way the logic books describe….A proof is a story. It is a story told by mathematicians to mathematicians, expressed in their common language….If a proof is a story, then a memorable proof must tell a ripping yarn….When I can really feel the power of a mathematical storyline, something happens in my mind that I can never forget” (2006, 89-94). Marcus du Sautoy: “A successful proof is like a set of signposts that allow all subsequent mathematicians to make the same journey. Readers of the proof will experience the same exciting realization as its author that this path allows them to reach the distant peak. Very often a proof will not seek to dot every i and cross every t, just as a story does not present every detail of a character’s life. It is a description of the journey and not necessarily the re-enactment of every step. The arguments that mathematicians provide as proofs are designed to create a rush in the mind of the reader” (2015). Paul Lockhart: “A proof, that is, a mathematical argument, is a work of fiction, a poem. Its goal is to satisfy. A beautiful proof should explain, and it should explain clearly, deeply, and elegantly. A well-written, well-crafted argument should feel like a splash of cool water, and be a beacon of light—it should refresh the spirit and illuminate the mind. And it should be *charming*” (2009, 68). “A proof should be an epiphany from the gods, not a coded message from the Pentagon” (75). Paul Halmos: “The best seminar I ever belonged to consisted of Allen Shields and me.
We met one afternoon a week for about two hours. We did not prepare for the meetings and we certainly did not lecture at each other. We were interested in similar things, we got along well, and each of us liked to explain his thoughts and found the other a sympathetic and intelligent listener. We would exchange the elementary puzzles we heard during the week, the crazy questions we were asked in class, the half-baked problems that popped into our heads, the vague ideas for solving last week’s problems that occurred to us, the illuminating problems we heard at other seminars—we would shout excitedly, or stare together at the blackboard in bewildered silence—and, whatever we did, we both learned a lot from each other during the year the seminar lasted, and we both enjoyed it.” (1985, 72-73). Ian Stewart: “When two members of the Arts Faculty argue, they may find it impossible to reach a resolution. When two mathematicians argue—and they do, often in a highly emotional and aggressive way—suddenly one will stop, and say, ‘I’m sorry, you’re quite right, now I see my mistake.’ And they will go off and have lunch together, the best of friends” (2006, 28). If only this crowd had been there to tell these mathematicians to cease and desist with all this needless talking, arguing, explaining, listening, and charming each other! How much time could have been saved! And how many “needless” ripping yarns, poems, well-written, well-crafted arguments, and rushes in the mind we could have skipped!

#### Michael Paul Goldenberg

@Tracy Zager: Excellent comment.

#### Brett Gilland

“I suspect that Gilland’s employer, and certainly the parents of his students, would also disagree. Some quite strongly. The primary purpose of school is to educate the kids at hand, not to train the teacher.
This doesn’t mean that teachers do not learn from experience, but if gaining experience and insight is the primary reason for what the teacher does, he’d better get approval from an IRB and a waiver from each individual student or parent who attends his class.” Funny thing, that. My employer, my parents, my students, my district, the state evaluator for my school, etc. all support my teaching. Most quite strongly. This might be due to the fact that when most people hear “I work really hard to understand your child’s thought processes so that I can better guide their thinking and draw out subtleties and conflicting mental models,” they don’t think “Oh my God, that man is performing experiments on my child to improve his educational practice.” Instead, they think “Oh my God, that man really cares about what is going on inside my child’s head and is attempting to tailor instruction to what he finds there. Thank goodness he isn’t stuck with a teacher who believes that teaching is just lectures interspersed with quizzes to determine if my child gets it or needs to be droned at more with another utterly useless generic explanation that takes no account of what my particular child is thinking!” Honestly, if you weren’t such a well-known advocate for traditionalist instruction, I would assume that you were trolling me with this comment, as it displays either an utter failure of reading comprehension or a self-parodying failure to incorporate even basic pedagogical techniques (such as asking questions to investigate what a student is thinking). If I were attempting to straw-man traditionalist instruction, I would trot out this line, but I would worry that it was a little over-the-top to be believable.

#### Tempe

I agree with the author, Greg, and Ze’ev on this question. Maths is not English; it has its own language, which we should encourage students studying maths to be familiar with and use, i.e., symbols etc.
Word problems are fine, but if the student executes the right algorithm or algorithms and applies the correct formula or procedure to arrive at the correct answer, then they ARE showing their working and conceptual understanding and shouldn’t be forced to explain it in words. Frankly, at times this is near impossible, especially for the novice. I wonder how you would explain in words, when you are 6, why 1 plus 1 equals 2? What is a satisfactory answer and what isn’t?

#### David Wees

I can think of some cases where the symbols alone don’t give sufficient insight into the thinking that went into forming them, e.g., where reproduction of the symbols does not result in the child in question actually being able to connect the mathematical ideas later. I was working with a pair of children and I learned from my work with them that they understood that there are ten thousandths in a hundredth and ten hundredths in a tenth. They were able to say that 1.8 and 1.80 are equal because 8 tenths and 80 hundredths are equal. They knew that 1.207 is less than 1.40 because 207 thousandths is smaller than 400 thousandths. They seemed to have a very clear understanding of decimals, at least up to the thousandths position. I then asked the child how much money $1.207 is worth. They told me that 207 is two hundred and seven cents, that 100 cents make up a dollar, and that this means $1.207 is the same as three dollars and seven cents. I asked them to write that down, which resulted in the statement $1.207 = 3.07. If I don’t push for at least the occasional explanation, then this way of thinking doesn’t emerge, and this child is likely to wander off with an incomplete understanding of decimals. As for statements like this: “There is nothing wrong with having a student occasionally explain his thinking when s/he is called on in the class.
There is a lot wrong with having students explain their thinking in a voluminous verbal/written manner on every test item and every home assignment.” This is an appeal to extremes. Of course we don’t want kids to write volumes of text for every assignment they do, but from this you are concluding that only occasional written or verbal explanation is acceptable? As for discovery learning being ineffective, this is actually another example of taking an extreme position and using it to argue against more moderate positions. Kirschner, Sweller, and Clark (2006) reported that there is little evidence to support pure discovery learning, and then grouped together a bunch of things that no pure-discovery ideologue would consider pure discovery learning and said that none of these approaches work. Here’s a critique of their paper, which took 5 minutes to find and actually cites some studies suggesting that constructivist approaches are not as terrible as is repeatedly claimed. I don’t advocate for pure discovery learning. I’m in agreement that it is probably not effective at meeting the goal of children learning specific facts and skills. I argue for a more moderate position: that there are portions of teaching (or teaching practices) in which mathematical ideas should be made explicit to children, but that much more attention needs to be paid to the internal models children actually develop as they come to understand mathematics. It fascinates me that cognitive load theorists offer schemas as the way knowledge is stored and retrieved but seem to pay no attention at all to what those schemas actually are and how they interact with each other.

#### David Griswold

“I wonder how you would explain in words, when you are 6, why 1 plus 1 equals 2?” *student holds up one finger* *student holds up a second finger* “one and one makes… one, two!” No need to be six. My 3-year-old does this regularly.
This conversation demonstrates an understanding of what “plus” means in the context of joining groups of numbers, an understanding I think almost every elementary student has, thanks to early counting-on-their-fingers conversations like this that nobody finds odd. Another reasonable explanation: “Put the numbers on the paper like this.” *Writes 1 2 3…* “Plus means to move. I start at 1, I move 1. That’s 2!” My three-year-old has not moved on to this mathematically equivalent but mentally different conception of addition yet, but as we play games like “Robot Turtles,” I can see him heading in the direction of understanding addition-as-movement. But for some reason, the instant we move on to math that three-year-olds really can’t easily grasp – multiplication, for example – the idea that we might want to instill and assess this sort of understanding becomes fraught with controversy. There are many useful applications and models for the abstract concept of multiplication: area models, array models, scaling models. They are all mathematically equivalent, all represented by the same mathematical symbol, but which multiplication model is inside the head of the student performing a problem matters. The array model is going to fail (or become tricky) when you encounter fractions. The area model is going to fail (or become tricky) when you encounter negative numbers. The scaling model will work well until you get to complex numbers, in which case you need to adapt the model. OR, and I think more commonly, you simply stop trying to understand through a model at all. You just treat the new thing as a new arbitrary rule. “It is not for me to wonder why, only flip and multiply.” If you think of K-12 math as nothing but a series of problem types to be mastered, boxes to be checked, then a broken model (but correct symbols) can get many students there. But I doubt any teachers think of math that way for themselves.
When we, as mathematicians and math teachers, read a series of abstract symbols, we imbue them with meaning and story. But that is not a skill that the symbols themselves gave us. The imbuing is the hard part. And the important part. And it needs assessing.

#### Alicia

I ask my students to tell me the story of how they solved the problem. This prompt invites reflection and detail. I want them to explain their thinking, show me their understanding, and tell me how they solved it. They can use words, symbols, pictures, tables, poetry, videos, audio clips, the ShowMe app. It takes effort and practice, but the result is worth it. I get to know my students in a different way. I learn how they communicate, and I can push them deeper into mathematical communication and understanding. I learn who they are as learners, which helps me in my planning. “Tell me the story of how you solved it” is a powerful assessment tool. Try it!

#### Michael Paul Goldenberg

Nice suggestion, Alicia, one I’ll be passing along to the teachers I coach.

#### Education realist

I don’t ask kids to explain their work. Too much hassle, and most kids confuse “I put down words” with “I explained”. “I will argue that if you arrive at your destinations time after time without ever being lost, it may not tell people much about the city or its traffic patterns, but it does clearly tell people that you know your way around the city.” Well, no. It used to. But then you start running into people who, when given a destination, find a map and painstakingly commit the specific route to memory. Then, once they’ve found their destination, they immediately forget that route. This used to be an absurd thing to do, and most people would never take the time and trouble, so it was a safe assumption that everyone who got to their destination knew their way around the city. No more. And “math zombies” is a great term.
Examples I see constantly: 1) Kid can find a derivative using the chain rule flawlessly, but has no idea why the derivative of a line is a horizontal line, or why the derivative of a quadratic is a line. None. Zip. Zilch. They don’t even understand the question. “Well, what line? Give me a line, I’ll find the derivative.” 2) Kid can graph all trig functions from memory but has no idea why the tangent function has an asymptote, much less why the tangent and cotangent functions have asymptotes at different values. “Because that’s what the graph does.” 3) Kid is given a graphed quadratic with zeros clearly identified, vertex sitting exactly on a point (but not identified), and a stretch clearly visible, and is asked to put it in vertex form. Uses the zeros, sets a=1, distributes, uses standard form to complete the square. Gets it wrong because, of course, he set a=1. But also, in map terms, reveals he only knows how to get to the London Bridge using one specific path, incorrectly, rather than the obvious shortcut that people who “know their way around the city” would instantly take. When told of his error, is upset because he did the best he could “without knowing a”. 4) Kid working the coat problem has no idea what 100% represents, much less 80%. Doesn’t know why he goes from 80% to .8 other than “that’s what I was told to do”. Can’t explain why he doesn’t just increase 160 by 20% or why that’s the wrong thing to do. And, given a problem in another form, might just do the equivalent of increasing 60 by 20% because he doesn’t understand why it wouldn’t work. Like I said, I don’t ask kids to explain their work. But I do ask test questions that ferret out zombies. And more than one zombie has failed my class despite a perfect record of As from teachers who consider accurately worked math a proxy for understanding. It should be. I understand this. But it often isn’t these days.
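For readers who don't know the coat problem, the confusion in item 4 can be made concrete in a few lines of code. The exact wording isn't given here, so a hypothetical setup is assumed: a coat sells for $160 after a 20% discount, and we want the original price.

```python
# Hypothetical coat problem (assumed numbers): a coat costs $160
# after a 20% discount. What was the original price?

def original_price(sale_price: float, discount: float) -> float:
    # The sale price is (1 - discount) times the original price,
    # so recover the original by dividing -- this is why "80%"
    # becomes 0.8 in the student's procedure.
    return sale_price / (1 - discount)

correct = original_price(160.0, 0.20)  # 160 / 0.8 = 200.0
naive = 160.0 * 1.20                   # the zombie move: 192.0

# The naive answer adds 20% of the *sale* price instead of 20% of
# the original price, so it undershoots: 192 != 200.
print(correct, naive)
```

A student who understands that 160 represents 80% of the unknown whole can answer this with or without words; one who only knows "change 80% to .8" cannot say why the naive version fails.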
And I agree with Michael Pershan: I really think most traditionalists would not be able to find fault with all but the most determinedly reform class.

#### Christian

One thing that is often forgotten when it comes to more language is that there is, in a way, double jeopardy: not only are weaker readers disadvantaged in reading but also in maths, because of ‘wordy’ assessment items (see the TIMSS year 4 relationship report). There is also a strong link with SES.

#### Education realist

“I wonder how you would explain in words, when you are 6, why 1 plus 1 equals 2? What is a satisfactory answer and what isn’t?” First, I am skeptical of the “explain your thinking” approach. That said, it’s pretty obvious that the questions are designed to get kids to understand the number line. So a simple, correct answer to “explain your thinking” for 1+1=2 is the number line, with 1 marked, 1 added, and 2 identified. Or the words “because 2 is exactly 1 away from 1 on the number line.” Likewise, the weird grouping stuff is an attempt to get kids to understand place value. The underlying objective is to focus all math around the number line, and to instill in all kids an understanding of the meaning of place value. This is something math professors have moaned about for decades: that kids don’t understand what “=” means and that they don’t grok place value. I’m not saying I agree with it. And certainly, if that’s what they are teaching, then it’s easy enough to test without open-ended “explain your thinking”. Or maybe it isn’t, because zombies. I dunno. Not a fan. But it’s not hard to understand the objective.

#### Tim Hartman

So… explaining your math isn’t about proving to your teacher that you can do the problem. It’s about metacognition! You can easily understand procedures and get problems correct without any explanation.
But if you’re able to think about your thought process and figure out what was going on in your brain to get you to that right answer, you will not only have a deeper understanding but will become a better problem solver, able to use those smaller concepts and ideas in future problems that can’t be solved with the steps you memorized in class. If the only reason for math class is to learn procedural knowledge for problems that only show up on math tests, then sure, ditch the explanations. But if we want to teach conceptual knowledge and problem-solving skills, explanations are crucial for lots of kinds of problems. Plus, the number one marketable skill that people can get out of a math class is communication – being able to communicate complex ideas and convincing solutions to problems. I bet the authors are the type of people who brag about how bad at math they are. [shudders]

#### Edmund Harriss

Most of my experience comes from dealing with university students, but I have worked both with math majors and with students in Quantitative Literacy (an alternative to college algebra), both in groups of 20-30 students, where I feel I can see individuals as well as the group. In both cases I see issues with the “just get the right answer” approach from school. For the strong high school students (who make up most of the math major community), their use of formal symbols in their working is itself very poor and does not respect the meaning of those symbols. A classic example is using the = sign to mean “and the next step is”. Worse, when I need to break the rules they have learnt (and every mathematical rule can get broken in some system), the result is confusion and resentment, rather than the joy at new understanding (and breaking rules) shown by students who go beyond the methods. For the weaker students, many do not seem to have arrived at the first step.
Many years of math instruction have left them with some ability to move symbols around but no consistency in the meaning from line to line. This is understandable, as they don’t seem to feel that mathematical concepts have meaning. A competing issue is the fact that the goal of mathematics is to process ideas reliably without constantly checking the meaning. As a result, a computer can do mathematics. This is the wonder of the subject, but it does create problems! I do believe that a lot of understanding does come from mechanical manipulation, and that fluency there is a very important goal. Yet if it is the only goal, then only a lucky few will gain the understanding that they are also doing something more.

#### Brett Gilland

“I don’t ask kids to explain their work. Too much hassle and most kids confuse ‘I put down words’ with ‘I explained’.” “not only are weaker readers disadvantaged for reading but also for maths because of ‘wordy’ assessment items” I suspect that I agree with both of these authors more than I disagree. However, both of these quotes seem to suggest a ‘they are bad at it, so I don’t bother teaching it’ approach that seems a bit backwards. Maybe the opportunity cost is too great in Education realist’s eyes. I get that. But when you point out that kids think putting down words is equivalent to explaining, it seems like time to pull out the red (or green, if that is how you roll) pen and start addressing the difference between words and quality explanations. Similarly, while I agree with the examples given by Christian (explaining your thinking on high-stakes tests is often worse than useless, especially when graded by Craigslist interns), I worry that if we see disadvantaged students with language issues and then steer away from those language issues in favor of ‘just teaching maths’, we are doing them a massive disservice.
Our math department recently (3 years ago) switched to a much more reading- and writing-intensive math program, and not only have our math scores gone up, so have our English scores. That is a win-win, and it only really happens when math teachers accept that they have a role in teaching quality writing and argumentation in mathematics class, in addition to what the English teachers are doing in that regard. “Plus, the number one marketable skill that people can get out of a math class is communication – being able to communicate complex ideas and convincing solutions to problems.” So much this. Not just marketable, useful in general. High-quality non-fiction communication is extraordinarily rare and supremely useful. “I bet the authors are the type of people that brag about how bad at math they are. [shudders]” I actually assume the opposite: that they are both highly successful mathematicians who believe that math worked for them, so there is no need for reform. They communicate mathematically quite well and picked it up intuitively after learning enough procedures, so they assume that this is the most efficient and effective way to teach others mathematical communication. Selection bias seems to be disregarded by this set.

Metacognition! Thank you, Tim Hartman and David Coffey. I can’t imagine anyone not seeing the value of that, but you never know. I found some things to think about in the original article. It is an important question to ask. I’d offer up my own anecdotal experience. I teach a class titled Finite Math in high school that includes a unit on the history of counting systems. We get into counting systems with bases other than 10, which leads into math in bases other than 10. Every year the students have difficulty with the arithmetic in base 8 or 5 or 16 (thanks, Andy Weir, for writing The Martian and giving a great example of hexadecimal!) and we have to return to base 10 and what they “know” to figure it out.
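Arithmetic in another base follows exactly the base-10 procedure; only the threshold for carrying changes. A minimal sketch of column addition in an arbitrary base (the digit-list representation is just for illustration, least significant digit first):

```python
def add_in_base(a, b, base):
    # Column-by-column addition with carrying, exactly the grade-school
    # algorithm: a and b are digit lists, least significant digit first.
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        column = carry
        column += a[i] if i < len(a) else 0
        column += b[i] if i < len(b) else 0
        result.append(column % base)  # the digit written in this column
        carry = column // base        # the amount "carried" leftward
    if carry:
        result.append(carry)
    return result

# 57 (octal) + 25 (octal): 7 + 5 = 12 in decimal, which is "14" in
# base 8, so we write 4 and carry 1 -- the same move as writing 2 and
# carrying 1 for 7 + 5 in base 10.
print(add_in_base([7, 5], [5, 2], 8))  # -> [4, 0, 1], i.e. 104 in octal
```

Seeing that the only base-specific lines are the `% base` and `// base` is essentially the "a-ha" described below: carrying is trading `base` units in one column for one unit in the next.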
Every year there is an “a-ha” moment when we delve into what we are really doing when we “carry” in addition or “borrow” in subtraction. For the first time, for many, they understand what they had been taught to do. Yes, they had the procedure. But they didn’t understand it, and the lack of understanding was holding them back. And if that can happen at the arithmetic level, what about more complicated concepts?

#### Katharine Beals

Let’s look at other countries that are outperforming us in math. Are they requiring the sort of cumbersome, verbose explanations described in the article? No: while American students are busy writing up verbal explanations, diagramming their thought processes, and belaboring multiple solutions to problems that are comparatively easy in terms of actual math, students in other countries are moving ahead with conceptually challenging mathematics. These students–from Finland to Japan–are doing the sorts of problems that you can’t do unless you understand the underlying math. Compare the level of actual mathematics required in the mostly multiple-choice test questions on our Common Core-inspired high school exit exams (many of which, yes, can be done without deep mathematical understanding) with the sorts of problems that high school seniors are tested on in those countries that outperform us in math. Here is a link to a page with sample problems from the National Matriculation Examination of Finland: http://oilf.blogspot.com/2014/… . And here is a link to a page with sample problems from the Chinese Gao Kao: http://oilf.blogspot.com/2015/… The American approach is to build conceptual understanding through time-consuming student-centered discovery of multiple solutions and explanations of relatively simple problems. An internationally more successful approach is to build conceptual understanding through teacher-directed instruction and individualized practice on challenging math problems.
It would be interesting to see how American students would do on the Finnish and Chinese exams as compared with their Finnish and Chinese peers.

#### Barry Garelick

“I actually assume the opposite, that they are both highly successful mathematicians that believe that math worked for them so there is no need for reform. They communicate mathematically quite well and picked it up intuitively after learning enough procedures, so they assume that this is the most efficient and effective way to teach others mathematical communication. Selection bias seems to be disregarded by this set.” I did in fact major in math (though I am not a mathematician). But in elementary school I was not yet a math major; I received mostly C’s in arithmetic and was poor at aspects of it. It was not as intuitive for me as you assert, but I was able to build on procedural knowledge to gain understanding as I progressed through school. I would not have been able to provide good explanations of how I solved problems in the lower grades, though as I learned more in high school, I likely could have. The accusation of selection bias is just that: an accusation. Our article included links to peer-reviewed papers showing that procedural fluency and conceptual understanding work in tandem. We do not negate the importance of conceptual understanding in teaching math. The math I learned in elementary school included the conceptual underpinnings. I was not taught in the “rote” fashion in which math teaching of that era is often mischaracterized. I appreciate the interest in the article and the conversation on this blog.

#### Brett Gilland

As you point out in your discussion of the Finland model (http://oilf.blogspot.com/2014/10/the-finnish-fallacy-drawing-wrong.html), people often see what they want to see when they evaluate other countries to find out what they do correctly.
You apparently feel so confident in your analysis of what Finland does right that you are willing to overrule the former director of the Finnish Ministry of Education and Culture. This is a problem for both sides of these debates, and it comes from failing to analyze the data set as a whole and instead simply trying to cherry-pick conclusions from those who do well. This introduces survivorship bias and allows us to project whatever positive traits we wish to emphasize without acknowledging all of the counterexamples extant in the original data set. Also, while I would be interested to see American education move in this direction, this is still far from being an accurate description of most of American mathematics education: ‘The American approach is to build conceptual understanding through time-consuming student-centered discovery of multiple solutions and explanations of relatively simple problems.’ There are too many elementary teachers who have no solid conceptual grasp of mathematics to actually lead exploratory lessons. Most still teach lessons via lecture and using only one technique, because that is the only one they understand and trust to get reliable results. And most high school teachers (and college teachers) are still massively lecture-based with large problem sets, because that is how they were taught and what they know. I know that we all like to pretend that progressives have taken over mathematics education in the United States (progressives because it makes us feel less hopeless, traditionalists because it gives them something to rail against), but it just isn’t true and has never really been true. So comparing America’s ‘constructivist’ mathematics education to the systems in other countries is about as effective as comparing American unicorns to other countries’ horses.

#### Brett Gilland

I should be clear: when I noted my assumption, I really did mean it as an unsupported but suspected claim.
It doesn’t really color the rest of my arguments, because I realize it is without basis. Thanks for the clarification of your background. I do find such biographical details quite interesting. I also wanted to clarify a claim I made that you attempted to refute while actually, I believe, confirming the original claim. “It was not as intuitive for me as you assert, but I was able to build on procedural knowledge to gain understanding as I progressed through school.” When I assert that you picked up mathematical communication intuitively, I do not mean that it was easy or natural. I mean that your increased procedural knowledge led you to an increased ability to communicate your mathematical knowledge without explicit instruction. What several of us have addressed in this thread is that many, if not most, students do not actually acquire skills in mathematical communication without explicit instruction and revision. This seems to be a problem that you did not experience and do not address.

#### Larry Sizemore

At a recent PD for grade 3 teachers, I lamented the fact that PARCC assessments do not have a means for students to convey understanding other than words and clumsy equation editors. This has the unintended consequence of pushing teachers toward delegating increasing amounts of class time to helping students develop written explanations. It is difficult, as a district, to communicate the significance of encouraging various representations of understanding and process when the high-stakes test, the basket into which all of the eggs have been placed, is getting it all wrong in such a glaringly obvious way.

#### Dee Crescitelli

I ask students to explain their thinking for multiple reasons… Sometimes they have misconceptions that need to be cleared up. Sometimes they did something really interesting that I want other students to see and hear about.
…and sometimes it is because I know that students can type 75% of the problems we give them into some website and get the answer and the steps for solving, with no thinking required. While that may be useful for the short term, I want to arm students to be able to USE the mathematics as well. To come full circle with this, many of the students in our high schools and colleges will grow up to be teachers… where they will have to be able to explain their subject matter. If they are math teachers, then they will have to be able to lead conversations about all the aspects of number, and about how numbers are structured and why things work the way they do in base 10, and properties, and… etc… Count me in the YES camp on having students explain their thinking as an excellent teaching and understanding tool. #### Education realist “However, both these quotes seem to suggest a ‘they are bad at it so I don’t bother teaching it’ approach that seems a bit backwards.” It’s pretty clear from my previous comment that I do bother teaching the “why”. However, I don’t say anything like “show your thinking”. What I do instead is ask questions that the kids will only understand if they understand the concepts. Brett is completely correct in the last two paragraphs of #49. Reform math, as it’s called, is very popular in elementary school because teachers prefer it. But it’s made no headway in high school, or does for only a limited time, because high school teachers mostly despise it. Overwhelmingly, math teachers in high school teach exactly the way Barry and Katharine want it done–problem sets and lectures. And they lose millions of kids completely, while giving As to a lot of kids who don’t understand a thing. I’m not a fan of reform. I know kids who can do math in their heads, understanding concepts perfectly, and I’m fine with that. I don’t much care for discovery. But I’ve spent too much time getting squicked out by automatons who don’t understand a thing.
#### Ze'ev Wurman Many commenters argue that verbal explanations are important. I doubt anyone denies that. Explanations are important in a one-on-one setting, in a presentation by the student to the class, or when s/he is called upon by the teacher in class. No sane person would dispute that. The *abuse* of explanations that the Atlantic piece tackled, and that I commented on, occurs when explanations are required for essentially every — even the most trivial — homework and test item, or when “verbalism” is *demanded* even when proper clear symbolism has already been used by the student. This is what seems to occur in Common Core testing and in many Common Core classrooms, and what I think Beals and Garelick — and I — find objectionable and counterproductive. #### Michael Paul Goldenberg And no thoughtful, reflective mathematics educator advocates the sort of rigid application of “explain your thinking” to every problem, particularly those which are trivial for students at a particular developmental and grade level. That holds IF you’re teaching a classroom of students where you have adequate evidence that, for those students, the problems you’re asking for explanations on are trivial. If not, then good teachers are obligated to try to probe where students are having difficulties, confusion, etc. And one way that proves effective is asking for explanations. But of course, there can be difficulties brought on by “The System.” If I want information that allows me to do formative assessment – to the benefit of my students and my own ability to more effectively teach them – I need both the time to write the individual comments that most experts in formative assessment believe are essential, and the sort of school culture that allows non-graded formative assessments to be taken seriously by all stakeholders: students, parents, administrators, and faculty. Without that, formative assessment doesn’t work.
So then I am forced to GRADE the explanations students offer – in that case I can’t get the kids to take them seriously if they’re not graded. And then I have very upset students and parents who don’t see why a kid who gets the “right answer” should have to explain anything at all. There are many excellent reasons for teachers to ask students to explain their thinking, and many benefits for students who learn how to reflect on their own problem-solving strategies and practices. But none of this will work in places where everything has to be graded and that which isn’t graded is trivialized. When numerical/letter grades are king, real learning is kicked to the curb, along with meaningful assessment. We can focus solely on those teachers who mechanically demand explanations on everything. We can focus on the Common Core and high-stakes testing, which drive many teachers to do things mechanically because they’ve been told – directly and indirectly – that their professional judgment doesn’t matter. We can focus on publishers and textbook authors who reinforce some of the worst tendencies of teachers prone to teach mechanically. Or we can look at the larger culture of “school” as an institution in which learning is the booby prize instead of the ONLY prize. Where getting ‘right answers’ suffices for earning the Mickey Mouse stamps of grades, and where parents – and hence students – care about nothing else. Everything is a means to an end, but there’s no actual end, and the process is utterly devalued and trivialized. And then we may well conclude that we get what we ask for from our schools, though it might just not be what we or our children actually need. #### Doug McNamara So, I’ve been struggling with implementing in my classroom the last point you make: “Beals and Garelick have a valid point that teachers and schools often constrain the function (understanding) to form (boxes, columns, and rubrics).
When students are forced to contort explanations to simple problems into complicated graphic organizers, like the one below from their article, we’ve lost our way.” My students LOVE (nay, CRAVE?) graphic organizers. I think the reason they do is that they really want to just mindlessly follow the steps on the graphic organizer without really understanding what they’re doing. By the time I get them in 10th grade Geometry, they have been fed a steady diet of these things and have become completely dependent on them. Meanwhile, when I probe them to better understand their thinking, I often get unsatisfactory answers like “because that’s what the formula told me to do.” GAH. The analogy I have tried to use (with limited success) is to cooking. I want them to be chefs who create new recipes and not cooks who follow old ones. I want them to create new ideas and knowledge. They’re certainly welcome to write those recipes down (i.e. create their own graphic organizers), but only as a reference for later. To assume that the only students who are incapable of adequately explaining their answers must have Asperger’s is quite naive. I think that you may not realize that these explanations are required of children who are not developmentally ready to explain their answers. You are asking students in the Concrete Operational stage (7 to 11) to 1) have counterfactual thinking by explaining why an incorrect calculation has been observed and what can be done to fix it, and 2) have abstract logic and reasoning where they need to explain more complicated topics on a level appropriate for an older child. These are all things that the majority of the students will not be able to do until they turn 12. These are concrete thinkers here. Depending on the question, their ability to explain their answer may be hindered by their development. It’s a shame to penalize a student because they are on track developmentally. Remember, children are not small adults.
#### Michael Paul Goldenberg Brett, the essence of what you say about the sort of mathematics teaching most commonly found in the US is in keeping with both my own professional observations in various districts, schools, and classrooms in Michigan and New York City over the last 25 years, as well as a conversation I had with Jim Hiebert about 11 years ago. He told me at that time, in essence, that you could select a classroom in this country at random and the probability of seeing the sort of teaching that NCTM had been advocating for over the previous 15 years was effectively zero. Obviously, that sounds absurd to some people, but I understood what Hiebert was getting at. And nothing I’ve seen in the last 11 years makes me think that there has been a radical shift towards what Hiebert was hoping to see, particularly in high schools. My experience suggests that while K-5 teachers are more open to reflecting upon and improving their math teaching practice, their relatively weak mathematical understanding makes that challenging in many cases. If you lack a repertoire for explaining why and how the mathematics kids are trying to understand in their primary and elementary grades actually works, you’re going to struggle to reach a lot of students who can’t get it from the one perspective you have a grip upon (however tenuous). For high school teachers, the problem can still be weak content knowledge, but it also tends to be a degree of defensiveness and/or arrogance that comes from ostensibly being on the top of the K-12 ladder. Those who were math majors generally were successful in their own K-12 student experience with math, and far enough beyond that to get a degree in the subject. And the combination of those things can conspire to make teachers less flexible, less empathetic, and/or simply less willing to make adjustments to the needs of diverse students. 
There can be a tendency to teach to the kids with whom they identify: the ones who get math at this level quickly and easily. Everyone else might be dismissed – consciously or not – as unintelligent or recalcitrant. The onus for student failure is then placed squarely on the shortcomings of the students, their parents, and colleagues who had these kids in earlier grades and “didn’t get the job done.” Reflecting on one’s own practice is always challenging and to some degree threatening. And that connects quite directly with why such teachers may be unlikely to really support helping students become skilled and comfortable with crafting good explanations of their own mathematical understanding: if you don’t want to be a reflective practitioner, are you likely to promote students becoming reflective mathematical thinkers? #### Dan Meyer The *abuse* of explanations that the Atlantic piece tackled, and that I commented on, occurs when explanations are required for essentially every – even the most trivial – homework and test item, or when “verbalism” is *demanded* even when proper clear symbolism has already been used by the student. This is what seems to occur in Common Core testing and in many Common Core classrooms and what I think Beals and Garelick – and I – find objectionable and counterproductive. No argument here, FWIW, though one’s confidence in the clarity of symbolism pretty much defines this whole thread. Doug McNamara: My students LOVE (nay, CRAVE?) graphic organizers. I’m not critiquing graphic organizers in general. I’m agreeing with Barry & Katharine’s critique of this particular graphic organizer and its use in this particular problem. #### Lauren Shakespeare When students explain their work they think about what they are doing, and the metacognition allows a deeper understanding of the method. If you only focus on the final answer it stifles a growth mindset. Pupils think there is a ‘right and wrong’ answer and that’s it.
Through explaining, you are putting the main emphasis on the process and praising that rather than the final answer, encouraging the process rather than the solution. The answer in itself is not all that impressive when the whole class have it, but the ingenuity and problem solving which they undertook to get there is. #### Tracy Zager @Michael and @Brett, thank you for your thoughtful comments. Great contributions. I am constantly amazed by the amount of stress, angst, ink, and fury over reforms that, by and large, HAVE NOT HAPPENED. #### Katharine Beals I should clarify what I mean by “American approach”: the approach inspired by national movements like the Common Core and the NCTM standards. Again, the internationally more successful approach has been to build conceptual understanding through teacher-directed instruction and individualized practice in challenging math problems. Finland is just one example. From the latest PISA (http://www.bbc.com/news/business-26249042) consider the 20 other countries that outcompete us. Most use the teacher-directed, mathematically challenging approach. I’m particularly familiar with the Singapore, Russian, and French curricula, but they are largely representative of what’s happening in continental Europe and East Asia. #### Brett Gilland It is interesting to me that you chose to double down on your initial argument without actually addressing the critiques given. Pointing to the top 20 performing countries doesn’t actually support a given system unless you demonstrate that this system is distinct from the lower scoring nations. Moreover, it would help if you demonstrated that those countries with the best scores were more in line with this system than those who did worse. So, for instance, it would help to know if Sweden’s mathematical education system was significantly different from Finland’s. Since they are culturally quite similar, that might eliminate some confounding variables.
Without this, you are cherry-picking and not establishing significant impact from differences in pedagogical methods. Similarly, though it has been pointed out that NCTM/CCSS methods (as I take you to be using the phrase) are incredibly rare in USA schools, you simply reassert that this is what you mean by “an American Approach”. The problem here is that this is not actually the approach taken in most American schools. That can indeed be what you mean by “an American approach”. However, no statistical analysis of American success (on the PISA, for example) will actually be available, because this isn’t actually the approach to mathematics education that is common in American schools! Both of the above are fairly basic ideas from statistics and research design methodology. Please account for them. I am more interested in the Canadian study, since we seem to at least have some attempt at quality research design. However, it will take some time to dig through it, and I am, frankly, quite suspicious of research from clearly biased sources which has not yet been vetted by the larger research community, especially when it comes on the heels of such shoddy PISA-based analysis. Not saying that it isn’t valid. Not saying it is. Just stating my priors. #### Katharine Beals Well-controlled experiments on this topic are few and far between. The Canadian study is one of them. What makes you think the source is “clearly biased”? In the absence of controlled studies, we can compare curricula, high school exit exams, PISA results (I said “20 countries” earlier when I should have said “about 30 countries”), and the relative representation of American vs. foreign students in math/science BA, BS, MA, MS and PhD programs and in jobs that require high-level (or even non-trivial) math skills. We can interview foreign exchange students about how much they learned in their American math classes vs. their math classes in their home countries.
We can spend time in foreign classrooms or read eyewitness accounts. In my experience, all of this points in a troubling direction. Widespread, systematic studies and controlled experiments, of course, would be nice. What are your sources for “NCTM/CCSS methods” being “incredibly rare” in USA schools? (Keep in mind that NCTM-inspired curricula include Everyday Math and Investigations). I agree that high school math is still generally more “traditional”; the problem here is that the mathematical material taught in U.S. high schools (and found in U.S. high school math texts) is, from everything I’ve seen, read, and heard, significantly below that found in most other developed countries, both in terms of mathematical challenge, and in terms of mathematical depth. It’s what Barry calls “traditional math done badly.” One reason for my confidence that few American high schoolers could handle the Finnish and Chinese exit exam problems I alluded to earlier is that such challenging problems are rare in American textbooks. Note that several of the problems I linked to earlier are proofs. Ironically, despite the NCTM/Common Core emphasis on “conceptual understanding,” American high school math texts require fewer and fewer mathematical proofs than they did one and two generations ago. I will close with three statements that I can’t prove, but that I propose are worth considering: 1. Many American students–regardless of whether they turn out to be math whizzes–are more mathematically capable than is often thought, and would be capable of doing much more challenging math if only American classrooms and textbooks were more like those in other developed countries. 2. Any student who can do the kind of proofs and other problems seen in the Finnish and Chinese exit exams has conceptual understanding. 3.
Mathematically challenging problems, including (at the high school level) proofs, are a much better way to develop (and gauge) conceptual understanding than the so-called “meta-cognitive” approach that Barry and I discuss in our article. #### Michael Paul Goldenberg @Katharine Beals wrote in part: “What are your sources for “NCTM/CCSS methods” being “incredibly rare” in USA schools? (Keep in mind that NCTM-inspired curricula include Everyday Math and Investigations).” Keep in mind that textbooks are not teaching methods. I’ve seen many teachers using the textbooks you mention, as well as philosophically similar books, in middle and high schools, who barely follow the approaches suggested in the teachers’ manuals, for one thing. I’ve seen teachers where such textbooks are the official curricular materials put the books away and eschew their use. I’ve seen teachers teach from those texts in ways that would be anathema to their authors, turning what we can call “NCTM-style” mathematics teaching into traditional teaching with ostensibly progressive reform curricula. In brief, teachers who do not accept student-centered instruction, discovery learning, what is rather oddly called “constructivist teaching” (odd because no matter what gets said, constructivism is a theory of learning; none of its theorists ever wrote a constructivist theory of teaching), etc., can easily stay light years away from it regardless of the official textbooks through any number of ways, and do so routinely.
Add to that the teachers who may be neutral or even positive about such books and, ostensibly, the teaching methods espoused by their authors, who either don’t understand how to effectively implement the methods or who find that it simply doesn’t suit them to do so (and hence they quickly revert to the traditional approaches with which they were taught), and it’s quite easy to explain how Brett, I, Jim Hiebert, or anyone else who might be looking at classrooms in various parts of the country could assert that what I’ll lump into the term “progressive math teaching” is rare in this country, more’s the pity. #### Tempe Here’s what progressive maths looks like in many Aust. primary schools. Algorithms are learnt very late – maybe in year 4, maybe not at all… If you insist that kids need plenty of practice and need to rote learn formulas and procedures, then you are not teaching maths properly, i.e. no conceptual understanding. The conceptual part lasts for many years, finally moving to the abstract language far too late in the primary years. By this time kids have been constantly working with manipulatives (delaying actually “doing” any maths) and asked to explain concepts in English. The result: far too many kids have no foundation skills, have not committed anything to long-term memory, so haven’t learnt anything and, worst of all, are confused by the strategies they are forced to use to demonstrate “deeper understanding”. So they don’t really come up with the strategies themselves (discovery/inquiry maths); they are rote-taught the alternative strategies and then expected to apply these convoluted strategies, rather than the efficient algorithm. So there isn’t any deeper knowledge from the “invention” of strategies (which progressives constantly harp on about) and, sadly, for way too many students (except those who are tutored and taught standard algorithms) there is confusion. The end result: “I can’t do maths.”
The Australian National Curriculum does not mandate the standard algorithms (so they are optional) but it does mandate strategies. Times tables must be learnt by the end of year 4, but many teachers outsource this vital knowledge to parents, some of whom will never help their children reach this goal. So is progressive maths “real”? Does it exist in the classroom? Yes it does. #### Michael Paul Goldenberg I’m sorry, but the 1989 NCTM Standards never called for what you describe, though many of its most vocal critics claimed that it demanded an end to paper-and-pencil arithmetic, to “standard” algorithms, etc., and also that math must be discovered by each individual student. No matter how often those claims are debunked, they rise again like the undead in a horror movie to stalk the living and stoke their fears. The facts, however, are quite different. The words that appeared throughout that volume were ‘less emphasis’ on X, ‘more emphasis’ on Y. That’s a pretty mild request (and that’s all NCTM could ever do: request shifts in emphasis). What individual teachers, districts, states, publishers, and textbook authors chose to do with those suggestions, how they interpreted and tried to implement them, was a highly diverse enterprise. But since I first arrived at the U of Michigan as a graduate student in mathematics education in 1992, I’ve read so much from people who claim that NCTM called for eliminating everything and anything “traditional” from US mathematics classrooms (and how similar organizations did the same in other countries, including England, Canada, and Australia).
Never mind that the people I’ve studied under, worked with, gotten to know through their writing online, listened to at professional conferences, etc., seem pretty consistent in making modest suggestions and calling for shifts, not revolutions, in how teaching is done and classroom time is spent (and I’m speaking about people from many countries besides the United States): the claims just will not cease that supporters of progressive mathematics education are all mad professors experimenting on innocent children, always wrongheadedly and to the utter detriment of the nation’s and world’s youngsters. How, exactly, so many thoughtful, highly educated, dedicated professionals manage to get EVERYTHING wrong is never explained. It seems to suffice to make the claim that our mathematics classrooms are in a shambles, all due to progressive reform ideas and thinkers, while in those “other countries,” all is marvelous, all kids are above average, and all teachers teach just the way that those who oppose NCTM et al. were taught once upon a time in the good old days. Well, maybe that’s true. But it pretty well strains credulity, as well as runs counter to my experience over the last quarter century. I think it runs counter to the experience of a lot of very smart, creative, thoughtful mathematics teachers who contribute to this blog and other practitioner blogs I’ve been following for going on a decade. So you must excuse me if I read the description of Australian math education with some skepticism. It’s too one-sided, too different from the work Australian colleagues have provided over my career, too much like similarly slanted descriptions of US classrooms I’ve read that go far afield from my own observations in schools throughout southeastern Michigan and in New York City. As the cheap headline writers are wont to put it: It Doesn’t Add Up.
#### Dan Meyer I’m interrupting this discussion of international comparative ed to call out an earlier exchange between Brett Gilland and Ze’ev Wurman. I excerpted it into its own post as it seemed like a precise encapsulation of two perfectly diametrical views on learning. They did it right: everyone walks away feeling well-represented; everyone walks away feeling like their side won. That’s good writing. #### Brett Gilland I want to address your last comment with the care it deserves so I am breaking this into several responses to draw out different threads in your last post. “Well-controlled experiments on this topic are few and far between. The Canadian study is one of them. What makes you think the source is “clearly biased”?” Initially, the red flag was that it was only really addressed on clearly biased sites and had very few further citations from other published papers. That is a big red flag. Now that I have had more time to look over what data I can find (and looked at the earlier draft of their paper), I find their discussion of the PISA to be particularly problematic. To hear them tell it, Quebec either held steady (which would not support their findings) or declined. However, when you look at the data (http://cmec.ca/Publications/Lists/Publications/Attachments/318/PISA2012_CanadianReport_EN_Web.pdf), specifically Table 1.6 in the link, you notice that PISA scores grew pretty steadily (though by small amounts) over that time, while other provinces declined. That does not build confidence in their case. Indeed, their failure to consider the performance of other Canadian provinces over the same period is a warning signal on its own. The lack of a control comparison when one is available suggests an unwillingness (or at least a lack of interest) to account for possible historical threats to validity.
This might explain the fact that while many Canadians readily acknowledge a poor rollout of the change in curriculum (http://www.cea-ace.ca/education-canada/article/15-years-after-quebec-education-reform-critical-reflections), no one in Canada really questions whether it was effective. More recent results have supported that claim. (http://www.theglobeandmail.com/news/national/education/quebec-students-place-sixth-in-international-math-rankings/article15815420/). And by 2006, people weren’t questioning what Quebec was thinking, but instead were trying to figure out why their provinces had failed so miserably to follow their lead (https://www.umanitoba.ca/publications/cjeap/pdf_files/raptis.pdf) “In the absence of controlled studies, we can compare curricula, high school exit exams, PISA results (I said “20 countries” earlier when I should have said “about 30 countries”), and the relative representation of American vs. foreign students in math/science BA, BS, MA, MS and PhD programs and in jobs that require high-level (or even non-trivial) math skills. We can interview foreign exchange students about how much they learned in their American math classes vs. their math classes in their home countries. We can spend time in foreign classrooms or read eyewitness accounts. In my experience, all of this points in a troubling direction. Widespread, systematic studies and controlled experiments, of course, would be nice.” We can do all of these things, but we must do so in a way that is rigorous and with an eye to challenging our biases, not to confirming them. Given that my experience with all of these considerations points in the opposite direction from yours, I suspect we may be at a stalemate of priors. And while I hate to beat this into the ground, half (or more) of your comparisons only hold if you can show that American mathematics education is significantly progressive at any level. I will take that question up again in my next comment.
#### Tempe Whether these ideas were prescribed by a body of any kind is kinda beside the point. The point is, if you don’t make it really clear what it is that is required (heavily prescriptive) then you are left with the described situation in Aust. schools. Clearly our curriculum is far too weak and has “loopholes” which “allow” teachers to not teach algorithms or times tables. Why is it so many teachers are “teaching” maths this way? Because what is being hawked at university is that rote is bad, discovery is good, and in maths the standard algorithms inhibit deeper knowledge, which is absolute nonsense… Where are those sorts of ideas coming from? #### Michael Paul Goldenberg @Tempe: It is impossible for me, at least, to meaningfully engage in that conversation if you’re going to insist upon such an extremely one-sided take on what you describe. I don’t think putting things as you do is conducive to exploring. You have a strongly-held point of view (all this stuff is bad) and you want agreement from me and others. You have a villain (university teacher educators). You appear to have a hero (those who support the good old days). That’s not how things look to me from inside of classrooms, particularly not in the low-income, high-needs schools and districts where I coach secondary teachers. I’ve only been to Australia twice, and was not in a position to visit schools on those trips. But I’ve known mathematics educators from Australia. They were without exception a bright, hard-working, insightful bunch of people. I believe they would take exception to how you’re characterizing things there. But all that notwithstanding, if you’re after a productive conversation, I can’t offer you one given the stance you’re taking. Maybe others can do better. #### Education realist Katharine’s propositions are nothing more than religious beliefs. “1.
Many American students—regardless of whether they turn out to be math whizzes—are more mathematically capable than is often thought, and would be capable of doing much more challenging math if only American classrooms and textbooks were more like those in other developed countries.” The kids in other developed countries that you approve of are either white or Chinese. They track out low performing kids by age 15, which is approximately when high schools pick them up. In Germany, Turks are far less likely to qualify for higher education. Finland has, historically, very few immigrants. Countries with fewer immigrants do better on the PISA. I teach in a Title I high school, am considered very good at working with unmotivated kids. I am also, for what it’s worth, the go-to teacher for kids who have Asperger’s and autism, having worked successfully with high functioners in both categories. I have one now. I have taught every subject in math from math support through pre-calculus, and every one of those subjects I have taught for much longer than Barry taught algebra. I’ve taught *English* for longer than Barry has taught algebra. I’m not saying that to mock Barry, but simply to demonstrate the paucity of experience your side has. I also tutor kids of high, medium, and low ability, from fifth grade through graduate school. I know of absolutely no one with anything approaching the range of my experience who would sign on to your assertion. And while I have less than 10 years’ experience, there are few who can match me for subject range and student diversity in SES, ethnicity, and ability. It is simply an absurd pipe dream to believe that math achievement in this country would be even substantially improved by curriculum or instruction methods. I think we could get a bump by convincing more kids to try. It wouldn’t be a big bump, though, and curriculum and instruction methods wouldn’t be nearly as important to outcomes as eliminating homework.
However, I do think that straight lecture and procedures are only going to work with kids who want to get good grades, and that’s a very small group. It’s worth remembering that one of the big issues with high school math and the idiotic expectations we’ve piled on all kids is that we have destigmatized an F for an alarming swath of the population. So failing isn’t a big deal. And at the low end, failing isn’t a problem because they know the school, with an eye to its graduation rates, will bail the kids out rather than let them continue to fail. “2. Any student who can do the kind of proofs and other problems seen in the Finnish and Chinese exit exams has conceptual understanding.” The Finns track. The Chinese cheat. The gaokao is not one test that everyone tries their best on. It’s multiple tests in many different provinces, and it’s much easier in the big cities. Cheating is rampant. When kids study, they aren’t working problems or learning math, but simply memorizing. It’s not even procedural. And it’s very likely that the top scorers cheat. I wrote about this, with cites: https://educationrealist.wordpress.com/2015/01/18/what-you-probably-dont-know-about-the-gaokao/ Cheating in Asia is so prevalent that it’s absurd to even pretend we know what’s going on there. “3. Mathematically challenging problems, including (at the high school level) proofs, are a much better way to develop (and gauge) conceptual understanding than the so-called “meta-cognitive” approach that Barry and I discuss in our article.” Your article didn’t discuss high school math at all. You discussed elementary school and one middle school example. I agree that mathematically challenging problems are a good idea. Proofs, however, are a waste of time for a big chunk of the population. I’ve read Barry’s book, or at least the excerpts on your blog. Barry taught algebra very much like 90% of the algebra teachers in America.
And by his own admission, it didn’t go very well for the limited amount of time he was able to teach. I speak as someone who has a track record of motivating the kids Barry failed with. Again, I’m not saying that to dismiss his experience. He seems like a well-meaning teacher who probably would have realized he needed to change his approach over time. By the way, there’s almost no research done on high school math, outside of algebra. It’d be nice to change that, but researchers are well aware that the ability range, once we move past the low-achievers stuck in algebra, is so vast that no conclusions could be drawn. #### Brett Gilland (Continuing my conversation with Katharine) “What are your sources for “NCTM/CCSS methods” being “incredibly rare” in USA schools? (Keep in mind that NCTM-inspired curricula include Everyday Math and Investigations).” I really don’t mean to be rude, but I have a question. Have you ever worked in K-12 mathematics education? I don’t mean that sarcastically or as a rhetorical question. I just noticed that we seem to be talking past each other from time to time, and it seems to be linked to either a lack of exposure to American mathematics classrooms or an exposure to classrooms that are wildly different from those I have experienced in the three states in which I have worked. Thus, the short answer to your question is “over a decade of experience teaching and collaborating with educators in three different states (and collaborating with others who work as trainers in many more states than that).” The long version… well, Michael addressed a lot of that above. Assessing what is actually happening in the mathematics classrooms of America by the adopted curriculum would be as silly as trying to assess schools of education by simply looking at their syllabi. No one would ever take such a process seriously. (And yes, my tongue is firmly in cheek right now.)
If you aren’t going into mathematics classrooms on a regular basis, preferably unannounced and with no possibility of it being taken as evaluation, then you have no basis on which to ground your claims. I should add that Michael’s explanation covers a large amount of what I see at the HS and MS levels. But there is another factor that massively affects mathematics education at the elementary level: mathematical incompetence. The simple truth is that many, if not most, elementary school teachers are very open about their weakness in mathematics. As such, they tend toward an algorithmic approach to mathematics instruction and are unable to implement any curriculum particularly well. That weakness, however, makes them particularly unable to teach reform curricula. Now, this may leave you convinced that reform curriculum should just be abandoned. I know others at both elementary and HS levels who were very successful teaching reform mathematics but who maintained that it shouldn’t be implemented broadly for exactly this reason. I respect that opinion, but believe that the better response is math specialists at all levels who possess a deep and nuanced understanding of mathematics. “I agree that high school math is still generally more “traditional.” The problem here is that the mathematical material taught in U.S. high schools (and found in U.S. high school math texts) is, from everything I’ve seen, read, and heard, significantly below that found in most other developed countries, both in terms of mathematical challenge and in terms of mathematical depth. It’s what Barry calls “traditional math done badly.”” The (very small) data set I have to work from doesn’t support this claim. However, I am quite aware of how limited my knowledge of foreign mathematics education is and won’t try to extend that to an argument that this can’t be true. That said, your argument from test difficulty really doesn’t hold water, as I will address in my next comment.
#### Greg Ashman I’m just jumping back in here to point out that argument from experience is essentially fallacious. If someone’s lack of experience leads them into error, then we should point out what that error is rather than attack their experience, because that just makes things a bit nasty. Similarly, if you are right, it won’t be because of your experience; it will be because you have hit upon some logical or empirical truth which you should be able to demonstrate. If argument from experience *were* valid, then I might point out that I have taught a number of students who transferred to Australia from China, Singapore and other Asian countries, and that they all have had a conceptual understanding of maths that was far in advance of most of their Australian peers. This is one reason why I find it hard to accept claims that “the Chinese cheat”. In fact, this strikes me as a little bit racist. #### Andrew I recently responded to both Dan’s post and the original article with a piece of my student’s thinking. Would those who think only arriving at the “right” answer conveys mathematical proficiency be satisfied with this student’s understanding of place value? #### Brett Gilland “One reason for my confidence that few American high schoolers could handle the Finnish and Chinese exit exam problems I allude to earlier is that such challenging problems are rare in American textbooks. Note that several of the problems I linked to earlier are proofs. Ironically, despite the NCTM/Common Core emphasis on “conceptual understanding,” American high school math texts require fewer mathematical proofs than they did one or two generations ago.” The first important note here is that most Chinese can’t handle the gaokao. It is notorious for the fact that it is extraordinarily high stakes, has caused multiple suicides, and excludes over 2 million students from college every year.
(http://www.businessinsider.com/24-stunning-photos-of-chinas-college-entrance-exams-2015-6) (http://www.bbc.com/news/world-asia-china-33059635) I can’t speak to what the success rate is for Finnish students. As to your three assertions… “1. Many American students—regardless of whether they turn out to be math whizzes—are more mathematically capable than is often thought, and would be capable of doing much more challenging math if only American classrooms and textbooks were more like those in other developed countries.” You and I are in agreement that American students could do better in math. But the textbook isn’t the issue. I have taught with traditional textbooks and with reform textbooks (which have a lot of the proof and application problems you think American texts should have more of). In my experience, the defining issues are (1) teacher belief about student abilities, (2) student willingness to struggle with difficult material, and (3) the extent to which students are encouraged to be mathematical thinkers instead of hoop jumpers. Textbooks are waaaay down the list. “2. Any student who can do the kind of proofs and other problems seen in the Finnish and Chinese exit exams has conceptual understanding.” You overestimate the exams you are looking at. Your links were dodgy earlier, but I was able to find this post on the Finnish exam: http://oilf.blogspot.com/2014/10/high-stakes-testing-in-finland.html Problem 1 is basic simplification with factoring. They even made it a difference of squares to make it easier. It isn’t particularly demanding and could definitely be done without much understanding. Problem 5 is trickier (I think, but my Geometry is definitely weaker than my Algebra, so I hesitate to comment too strongly). I suspect there is a standard process for problems like this, as well, but I defer to others who might know better.
Problem 6 has some weird wording (perhaps it is easier in Finnish), but the solution is x=0, which requires a little bit of critical thought about squaring. Probably better for conceptual understanding. Great CCSS/NCTM problem once one gets past the wording. Problem 9 is the sort of question that could easily show up on a CCSS test (and drives me nuts because it is testing too many different concepts at once and is too easy to get wrong for a myriad of different reasons. Bad test design). However, it isn’t particularly challenging. The 3-d intersections are just plugging in 0 for the other variables. The area question is CCSS Alg 2 along with Law of Sines, I believe. I know it is A2 in our (reform) textbook. Problem 13 is a variation on the proof for the sum of an arithmetic sequence. Algebra 1 in the CCSS, btw. We don’t require a proof, but if we were going to require one, that would be one of the easiest to pick. And reform mathematics programs will typically explore that proof (geometrically, and then with algebraic symbols to formalise). Traditional American textbooks just give the formula as something to memorize because, get this, accurate procedures without understanding are considered sufficient. So the emphasis is definitely different, but the questions aren’t substantially more challenging, especially when compared to the current round of PARCC/SBAC tests. The gaokao example I could find (http://oilf.blogspot.com/2015/10/math-problems-of-week-sample-12th-grade.html) isn’t particularly difficult if you have been through calculus. Part I is a straightforward derivative problem with a brief note that e>2. The proof would be fairly typical for an AP Calc written response question. I am honestly not sure what part II is asking. I am going to blame that on translation. “3.
Mathematically challenging problems, including (at the high school level) proofs, are a much better way to develop (and gauge) conceptual understanding than the so-called “meta-cognitive” approach that Barry and I discuss in our article.” I am glad you are coming over to the reform side, at least at the HS level :o). However, as someone who uses challenging problems and proofs to assess learning, I find that both are pretty useless without students describing their thinking. Unless you just mean don’t use that terrible mandatory mind mapping procedure, in which case everyone agreed with you long ago while noting that your argument from the single case was massively undersupported. #### Education realist On reading Brett’s response, I remembered belatedly that the folks here talk teaching, not policy. I’m a policy nut, and Katharine made a policy case, so I responded on those terms. I’m not going to pursue this save to respond to Greg’s criticism. I don’t think it’s possible to “attack experience”. You can attack the validity of experience, conclusions from experience, or restriction of experience. Barry is, from everything I can see, an expert in elementary math pedagogy. He’s been reading and thinking about it for a long time, and has worked with other experts. I think Barry would readily agree that he’s not an experienced high school math teacher. It’s a shame he wasn’t given the opportunity to expand his experience, most likely due to age discrimination, which is a real problem in school hiring practices. I found his memoir credible. It also revealed good teaching instincts. He described his method well, and also revealed that the results weren’t as effective as he expected (and in that, he’s welcome to a very large club!). I should also make it clear that I often agree with Barry’s reasoning; I just think his conclusions (that we’d get better results) are unfounded.
But if Jo Boaler were to show up in the comments thread, I suspect I’d side much more with Barry than Jo. Greg says arguing from experience is fallacious. In teaching, given the lack of data, experience is what we have. Rest assured, convincing *teachers* to adopt a desired policy requires experience or data that a) is demographic-specific and b) leads to an improvement so significant that the effort to change is worth it. When Katharine and Barry (and the other adherents) argue that forcing first graders to explain their thinking is inappropriate given the complex nature of the underlying math, they are on reasonably solid ground. They have a case supported by both data and practice. When they moved to the middle school example and percentages, their argument lost both. First, high school math teachers typically think students lack all conceptual understanding of percentages. Saying “they need procedural fluency over conceptual understanding” is far less compelling. At the college level, too, math professors complain that remedial and lower level math students lack all conceptual understanding of percentages. Second, the argument lacks any research supporting the advantages of procedural fluency. Finally, K&B ask: “Is it really the case that the non-linguistically inclined student who progresses through math with correct but unexplained answers–from multi-digit arithmetic through to multi-variable calculus–doesn’t understand the underlying math?” and on that point, clearly, experience matters, and the answer, from many high school teachers, is yes, it can be the case. Dan’s opening post raised that possibility, and many teachers chimed in with knowledge of “zombies”. Arguing from experience isn’t fallacious in teaching, although there are of course limitations. It’s particularly appropriate when the other side is trying to argue from logic and clearly lacks experience. Katharine then makes three unsupported assertions, comparing us time and again to other countries.
As no other country has America’s considerable heterogeneity in income, race, and ability, I consider all comparisons of this nature utterly pointless. But China, in particular, is egregious. Here I didn’t argue from experience, but from data. China’s corrupt academic practices and culture are well-established, and indeed shared throughout most of Asia. We have no idea what is typical for a Chinese student, not even a suburban kid from Beijing, because we don’t know the degree to which the tests are made more difficult as an attempt to thwart cheating. It goes without saying, but I’ll say so anyway, that China’s corruption doesn’t lead to the conclusion that all Chinese people cheat. On an experience basis I, too, have extensive experience working with Asians from the mid, south, and east of the continent. As America has a much more permissive immigration policy than Australia, I’ve probably seen a much broader range of ability: a) outstanding procedural and conceptual fluency, b) excellent procedural fluency, acceptable conceptual fluency within limited parameters, c) rote zombies, d) utter incompetence, unable even to reach zombie status. b is the mode. Next c, then a, then d. Americans (regardless of race) don’t fall into those categories as neatly. Many of our students are incredibly uncomfortable with c, and less determined to reach b. So we get a and d, and then a lot of flailers. #### Frank murphy The big problem with asking students to explain their work on assessments is that the explanation must be done with pencil and paper. It is done much more efficiently orally in the classroom. I can’t imagine anyone disagreeing with asking “why?” in the classroom. #### Lola I don’t think the answer to 6 is 0. It ought to be in terms of the a’s. Both x and any a_i can be negative, meaning x is merely a translation in either direction along the number line before you sum squares. There’s no reason to believe the optimal translation is 0. If a_1 is 0 and a_2 is 1, x = -0.5.
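The correction above is easy to check numerically. The following is an editorial sketch, not part of the original thread (the helper names `f` and `argmin_on_grid` are illustrative): a brute-force search over a fine grid confirms that the sum of squares sum((x + a_i)^2) is minimized at x = -mean(a), e.g. x = -0.5 when a_1 = 0 and a_2 = 1.

```python
# Sketch: verify numerically that f(x) = sum((x + a_i)^2)
# is minimized at x = -mean(a). Helper names are illustrative.

def f(x, a):
    """Sum of squared translations (x + a_i)^2 over all a_i."""
    return sum((x + ai) ** 2 for ai in a)

def argmin_on_grid(a, lo=-10.0, hi=10.0, steps=200001):
    """Brute-force the minimizer of f(., a) on an evenly spaced grid."""
    best_x, best_val = lo, f(lo, a)
    for i in range(1, steps):
        x = lo + (hi - lo) * i / (steps - 1)
        val = f(x, a)
        if val < best_val:
            best_x, best_val = x, val
    return best_x

a = [0, 1]
print(argmin_on_grid(a))   # prints -0.5
print(-sum(a) / len(a))    # prints -0.5, i.e. -mean(a)
```

This agrees with the calculus argument in the follow-up comment: setting the derivative 2nx + 2*sum(a_i) to zero gives x = -mean(a), the usual least-squares justification for using the average.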
I like this problem. It’s clear that it has an answer but not clear what that answer is. Actually, let me follow that up. You can take the derivative of the sum of squares with respect to x to get d(sum)/dx = sum(2x + 2a_i) = 2nx + 2*sum(a_i). For this to equal zero (i.e. be a local minimum; can we agree the graph is concave upward and continuous?) we want x = -avg(a), which is what intuition suggested (to me at least) and supports the use of averages as a “best fit” data point for a bunch of one-dimensional data. Sorry. I know this is a thread on math pedagogy, not math. I’ll leave now. #### Brett Gilland Lola. You are correct. I am just going to hang my head in shame and blame it on fatigue. Now I just want to play with the problem and see what comes out. #### Michael Paul Goldenberg Far too much of the conversation here is being conducted under false premises and worst-case scenarios which I very much doubt the progressive practitioners here and elsewhere would support as effective teaching. As soon as someone starts talking about “forcing 1st graders to ______” there is going to be a visceral reaction from those who don’t see whatever goes into the blank as absolutely necessary. But as knowledgeable mathematics educators know, in classrooms where teachers aren’t operating with some sort of high-stakes testing gun at their heads, and where they have developed their practice slowly over time, grounded in increasing pedagogical content knowledge (so that they can more easily resist being bullied into mechanically applying the very idea of asking for explanations), what comprises appropriate explanation in various grades and at various ages is not some monolithic idea that can readily be put onto a standardized test. Look at the classic 3rd-grade lesson from Deborah Ball in which students are asked to explain their understanding of odd vs. even numbers. Shockingly, no one offers an algebraic explanation.
But much of great value is revealed that simply would not have emerged in traditionally-taught classrooms. I don’t expect everyone to agree. Indeed, I don’t believe there is any amount of evidence that would convince some people in this conversation that it is reasonable to ask students to provide explanations of their thinking on a regular basis. That’s just the nature of these debates, I’m afraid: a great deal of entrenchment. I do have to wonder, however, how many teachers and parents who had the opportunity to view the teaching of people like Ball, Magdalene Lampert, or some of the practitioners here would be able to honestly reject what they see as wrong-headed, harmful to children, or inferior to traditional mathematics teaching in K-12 (and particularly in K-5). I suspect that number is a good deal smaller than some math warriors would like to believe. And it is clear that progressive, student-centered teaching isn’t going to vanish by fiat. So maybe some of those who seem to abhor it out of hand should reconsider 100% opposition. Or not. All that said, the idea that we have an alignment between the Common Core Initiative (and its concomitant assessments) and teaching that seeks to put more emphasis on student understanding, explored in part by expecting students in various ways to examine and discuss their thinking, strategies, problem-solving efforts, etc., is specious. I see no fundamental agreement between progressive teachers here or elsewhere and the folks who’ve given us the Common Core, or the assessments thus far produced by the two testing consortia, about what comprises good assessment. I have absolutely zero investment in or commitment to the Common Core as a political and economic instrument for further undermining free public education. I have none, either, to any instance of curricular materials that Big Publishing is selling to fit their notions of what the Common Core calls for.
In other words, the Common Core is an enormous red herring in this conversation. The issue of probing student thinking in mathematics exists for teachers regardless of the existence of CCSSI or any particular books. In my work as a math coach, one of my constant refrains about textbooks is that they are resources, not bibles. That some teachers, administrators, students, parents, or politicians fail to grasp that is unfortunate but ultimately something to be critiqued and, it is to be hoped, dispelled as another artifact of 19th century thinking about pedagogy (at best). Bad assessment, particularly when forced through the narrow portal of high-stakes, standardized testing, is another vestige of weak thinking about education that we must repeatedly struggle against, with or without some gigantic edifice like the Common Core driving it. But one of the biggest mistakes we can make is to claim that getting students to reflect on their own thinking about mathematics and mathematical problem-solving and sharing that thinking with teachers and peers is a bad idea because such and such a textbook, teacher, or test gets that badly wrong. The answer is to improve and broaden the base of understanding among teachers and other stakeholders about how this can be done effectively and used to the advantage of everyone, not to drop the idea. #### Katharine Beals I appreciate your discussion of the Finnish problems. Can you share with us some problems from the PARCC/SBAC that you consider to be of similar mathematical difficulty? And can you share with us some problems from NCTM/CCSS-informed textbooks that you consider to be of similar mathematical difficulty? (FWIW, when I talk about traditional American textbooks, I’m referring to textbooks that date back to the 1960s and earlier, which contain more proofs and more conceptually challenging math than most contemporary American textbooks, and more closely resemble textbooks still used in other developed countries.) 
#### Michael Paul Goldenberg I would avoid making this discussion about SBAC/PARCC or about textbooks. Many of the teachers here have worked hard to get out of the trap of publishers’ notions of how to teach mathematics. To allow this to become just another 1990s-flavored Math Wars conversation is almost certain to undermine the value of what’s really under discussion: a particular emphasis on something that for too long has been ignored in our mathematics classrooms, explanation. By the way, I’ve been meaning to mention that mathematical proofs are in part about someone sharing his/her mathematical reasoning with others. Of course, at the professional level, the discourse demands a level of abstraction and sophistication (“mathematical maturity”) that is far from what we expect in K-5 or even K-14. College students do not for the most part begin to learn how to do real mathematical thinking (if they ever do) until they finish the basic calculus sequence. Nonetheless, it is commonplace in mathematics for people to be expected to explain their reasoning and no one will get very far at all if all she can do is produce the correct numerical answers to computational problems. As Keith Devlin already suggested, we have loads of tools for doing that faster and more accurately. There is absolutely no reason not to expect students to be able to demonstrate their reasoning on a regular basis (though not for every problem or for trivial cases where at their grade level and personal development, it’s silly to ask someone to “explain” a result). I seriously doubt that there is a single person in this discussion advocating the sort of absurdity that we can all find examples of. Now, will we continue to beat against the patently obvious, or will we move towards a deeper and more productive level of investigation into how to go forward with explanation in various grades and contexts? #### Brett Gilland I understand your concern here, Michael. 
However, I also believe that textbooks really do shape the world we work in. One of the reasons that I am really intrigued by the work of the MTBOS as of late is that they are starting to formalize and piece together curricula instead of just offering modules or questions. I have great respect for those who generate their own curriculum from scratch. I used to do it myself. However, I also don’t believe that this can be the basis for high quality mathematics education at a national or even state level. This is primarily because it falls afoul of the “reforms that ask teachers to work harder are doomed to failure” maxim. Maintaining work/life balance and high quality classroom interactions (along with grading and various other requirements of the job) is tricky enough without adding “autonomous curriculum development” to the list. I can’t wait for Dan, Christopher and the rest of the team at Desmos to release the first great textbook of the new century. Until then, I content myself with another teacher-led effort from the last one, a program that I KNOW makes it easier to be a good teacher instead of making it harder. So, Katharine, I will bite. I will pick a few at random from my textbook’s homework help section (help for every assigned problem, scaffolded heavily at first and with reduced support in later lessons once the concept has been addressed a few times). I should also add that I am not picking problems I love in this instance, but looking for problems that seem to meet the parameters of what you consider high quality critical thinking questions. I am using the HW help section because it is freely available to all. The first section chosen at random from Algebra 2 included a gem in problem 4-26 (http://homework.cpm.org/cpm-homework/homework/category/CC/textbook/CCA2/chapter/Ch4/lesson/4.1.2) Click the problem number on the left. I liked all of the questions in this assignment from the first Geometry section I chose.
(http://homework.cpm.org/cpm-homework/homework/category/CC/textbook/CCG/chapter/Ch7/lesson/7.3.1) The expected value work in 7-121 is nice because it forces kids to come at the problem from multiple angles. 7-121c is the sort of thing I think you like, in particular. The proof work on 7-122 is typical, but do note that most requests for proofs in our text include the possibility of a proof not being possible, in which case we explain why. 7-125 is interesting because it causes students to brainstorm about what is necessary to describe a shape and what is sufficient. Class conversation about that one the next day should be excellent. The question is almost certainly a preview of the next lesson. First randomly chosen set from Algebra 1 (http://homework.cpm.org/cpm-homework/homework/category/CC/textbook/CCA/chapter/Ch3/lesson/3.2.4) 3-71 is a nice problem that mixes multiple representations to generate a rule for an arithmetic sequence. Atypical in that it requires connecting the representations to make sense of the problem. 3-74 is a straightforward question about closure of sets under a given operation, but seems like the sort of thing you would like for its ‘proofy’ aspects. I can go on, but honestly you can go look for yourself. Our adopted curriculum is saturated with stuff like this. It is also not very widely adopted, because kids freak out when they encounter problems like this without being specifically told how to work them (your complaint, as well), and so they scream bloody murder and turn to articles like yours to trash progressive textbooks and get them thrown out of their districts. The irony is hopefully not lost on you. PARCC Algebra 2. The PBA test items from #13 on all seem to hit what you are looking for. I think you would like #14 in particular.
http://parcc.pearson.com/resources/practice-tests/math/algebra-2/pba/PC194854-001_AlgIIOPTB_PT.pdf Algebra 2 EOY Item #1 is a nice look at polynomial divisibility (or it can turn into a slog of guess and check). #2 is, IMO, harder than the polynomial expressions question on the Finnish test, as it requires the student to deal with extraneous solutions. #7 is some pretty nice conceptual work on exponents. Etc. I should reiterate here that I am very mixed on the PARCC test. The sample questions last year had issues with combining multiple concepts within one question, making the data gained from such tests problematic. In addition, switching it from paper-based to computerized is a nightmare. That isn’t a natural medium for much of what our students do, and it made explanations (especially symbolic explanations) damned near impossible. I also have concerns with what level of proficiency should be expected for HS graduation, but that would be true of any standardized test. The examples above were just to give you some evidence for the claim that US kids weren’t being deprived of these sorts of questions. Nor, if they use halfway decent textbooks (of which there are far too few, too sparsely distributed), are they deprived of them in their daily math work. #### Barry Garelick While I write about elementary math pedagogy, my teaching credential is secondary math. I have not taught elementary school math, and my subbing experiences have been in high schools and middle schools. My book describes two situations: a six-week stint at a high school and a semester-long stint at a middle school. The algebra classes at the high school did not go well for a variety of reasons, the major one being that there were enormous math skill deficits among many of the students. For the record, the majority of students were white, and many had very bad family situations. The middle school algebra class went well.
I may have mentioned at one point a panic when the class did poorly on a test, but said that generally class averages were in the high 70’s, low 80’s. I didn’t go into detail about grades, but the majority of students got A’s and B’s in my algebra classes; there was 1 F, and 3 or 4 D’s went to students who shouldn’t have been placed in algebra. #### Paul Bogdan While understanding may be the ultimate goal, it is confusing the issue, because communication is the immediate goal (not explaining or justifying or giving reasons, either). The language of mathematics is often arithmetic mixed with algebra. I have been emphasizing (to my M2 HS students) that it is always the student’s job to communicate their solution (to formative assessments) to me. When it is tough, like with systems, it is still their job. The teacher must be demanding. For example (with linears): using y=mx+b without stating it is poor communication. So is not explaining or showing why m=4 (for example) or why b=2, never mentioning any math words like slope, y-int, delta y, or delta x, or making a poor graph or a sloppy, incomplete table. I call this ‘poor communication’ (and deduct as much as 25 percent), and the students are getting it. I like what they are showing me, and I think they are enjoying the opportunity to treat their work as a conversation with me. #### Katharine Beals Thanks, Brett, for these various links, and for your suggestions about which problems to look at. I hope to blog about a number of these in upcoming posts in my Math Problems of the Week series. My analysis will undoubtedly provide support for one of your earlier statements: people often see what they want to see. That aside, I find it useful to take a very close look at what students around the world are and aren’t being asked to do–independently of whatever else is, or isn’t, going on in their various classrooms.
#### David Griswold This last turn in the conversation is interesting, because it doesn’t actually seem to have anything to do with the original debate. At least not to my reading. I would love it if we could get students to the level of mathematical ability that that Finnish exam seems to require. Of course, the link stated that they only need to complete 10 of the 15 questions. I’m going to estimate that if a student can correctly answer, say, 7 of those, then they will pass. This is not a low bar, but it’s not as high as might be implied. Also, according to the Wikipedia entry on Education in Finland, only about 42% of the population completes that matriculation examination; 50% of the population never takes the academic track of the last two years of high school at all, and this is in a population that by and large is not having quite as many cultural battles about the value and importance of education as we are having. This is in a country whose median net household worth is around \$120,000 (110,000 euros) compared to \$69,000 in the U.S., despite a 60% (on average) tax burden.

Still, and obviously, America is not even doing well enough to get 42% of the population, or even 42% of the non-poverty-stricken population, to that level, but blaming that on any sort of traditionalist vs. reform curricular choice seems a bit bizarre to me.

First of all, a point of historical argument: in the 1960s, textbooks had harder problems in them because the vast majority of Americans never came close to completing high school, so most students never saw those books. In 1960, around 45% of urban Americans and 32% of rural Americans graduated high school. By 1970, it had gone up about 10 percentage points in each region. My father, who has worked as an outrageously successful medical doctor for 40 years, graduated in 1970 at the top of his high school class and earned a full scholarship to college and admission to a top-tier medical school, having never taken a calculus course, so he DEFINITELY would not have been able to pass the Finnish exit exam based on his exposure to those halcyon days. To pretend the American textbooks and education of that era were superior to now is simply to ignore the fact that very few students ever used them, and those that did were a select, privileged, and talented few.

Interestingly, I teach at a high-end independent school, and the select, privileged, and talented are my purview. I truly believe that there is not as much mathematical understanding or knowledge in their heads as there should be, but I certainly can't blame any sort of reform curriculum for that: my students pretty much entirely come through traditional-as-possible curricula, filled with memorization, mnemonics, and mathematical mimicry. Our AP Calculus students, AB and BC both, do 30+ exercises a night, exactly in the AP format, memorize every possible way to integrate or differentiate, and generally ace the hell out of the AP Exams, which may not be quite difficult enough for your taste but are certainly as difficult as we get in any standardized way. That Finnish exam would eat them for breakfast, though, because you are right that they don't really understand math in that way and, perhaps more importantly, they've never seen an exam like that.

But here’s the thing: there are two reasons Finnish students might be passing that test. Option one: they have seen problems similar to those enough that they can reproduce them. This is traditional mathematics education in its strength, and is how the students at my school so regularly dominate the AP Calculus exam. Option two: they are solid mathematical thinkers who can approach problems they don’t know how to solve without freaking the hell out, calmly apply knowledge they know is in there somewhere, and come up with a solution to a difficult problem they HAVEN’T seen before.

If option one is the method you want to encourage, then, sure, make the PARCC harder so it looks like that. Pay people who actually know math to grade it and you might be able to make that happen (ask College Board how they do it on the AP Statistics exam, if you don’t mind paying that high a price for each student). Publish enough sample tests that it can be taught to. Engage in some high quality, high stakes chalk and talk. Make sure that you find a way to track the ones who can’t “hack it” to a lower track that doesn’t need it so that you don’t have to be embarrassed by their scores. Make sure to blame their elementary teachers when they forget how to divide fractions in the middle of a calculus problem.

But if option two sounds like the better choice, then you believe in the tenets of reform education. The entire point of reform curricula, as I see it, is to encourage and require students to engage in mathematical reasoning ALL THE TIME. To become used to problems they are uncomfortable with. To get good at taking a deep breath and trying something. Most reform curricula attempt to include at least some of the research techniques that have been shown to improve long-term retention, such as mixed homework and test questions, which would definitely improve student ability on a multi-year exam like the Finnish one.

Of course, you can get this sort of thinking without using a reform curriculum. You can assign the hard problems in any Larson book, have students work them and discuss them with limited scaffolding, and get the type of mathematical engagement and thinking that this exam requires. But if you do that you are teaching a reform class with a traditional textbook. And probably requiring a LOT of homework in the process.

My favorite reform curriculum is the one used at Exeter, found here: https://www.exeter.edu/academics/72_6539.aspx . I'm sure many would be shocked to hear that Exeter, with all the New England Prep School implications, uses a reform curriculum, but there it is. Problem-based approach, problems worked in teams and discussed as a class, with the teacher serving as a moderator and facilitator. Is there direct instruction? Surely, as there is in any decent reform classroom. These students are certainly at least as prepared for an exam like the Finnish one as anybody, and a reform-style curriculum will get them there.

If we had a graduation exam similar to the one in Finland, perhaps a push toward Exeter-style curriculum would be possible everywhere. With my current realities (in which I can only assign half as much homework as them, if that) I have to choose: an efficient chalk-and-talk that teaches them to do lots of problems but not argue why they can do them, or a less efficient exploratory curriculum that encourages them to explore, discover, and debate mathematics at the expense of content. And when it comes down to it, I think THAT is the heart of the debate here.

#### Dan Meyer

Katharine Beals:

> Thanks, Brett, for these various links, and for your suggestions about which problems to look at. I hope to blog about a number of these in upcoming posts on my Math Problems of the Week series. My analysis will undoubtedly provide support for one of your earlier statements: people often see what they want to see.

Happiest ending, as far as I’m concerned.

Thanks, Katharine & Barry, for seeding the conversation, and everybody else for disagreeing with each other seriously and respectfully. Out of 90 comments, I had to moderate only the smallest handful. I’ll look forward to following the conversation on your respective blogs, Twitters, newsletters, zines, etc. Comments closed.