September 18, 2011

Seattle Lags Behind In Math and This Can Be Fixed!

One of the most frustrating aspects of working on the improvement of math education is dealing with an educational establishment that makes decisions based on fads and opinions rather than empirical facts.

Now, let us accept that there are different approaches to teaching mathematics, with a major divide between "reform/discovery" approaches and more "traditional, direct instruction" approaches.  Reform/discovery approaches became all the rage in the educational community in the 1990s, and I believe that is a major, though not the sole, reason that math performance has lagged.

As a scientist, the next step seems clear to me: test a variety of curriculum approaches in the classroom, ensuring that class demographics are similar, and find out what works best.  In short, do a carefully controlled experiment with proper statistics and find the truth empirically.  But what frustrates me is that such experimentation is virtually never done by the educational bureaucracy.  They go from fad to fad, and student progress suffers: reform math, Integrated Math, Teach for America, Whole Language, and many more.
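
To be concrete about what I mean by "proper statistics," here is a minimal sketch in Python (with entirely hypothetical pass counts, not real district data) of one basic comparison: a two-proportion test on state-test pass rates from two demographically matched groups of classrooms, one group per curriculum.  A real study would also need randomization, pretests, and more careful modeling; this only illustrates the arithmetic of the check.

```python
# Two-proportion z-test on pass rates from two matched groups of classrooms.
# All counts below are hypothetical placeholders, not real district data.
from math import sqrt, erf

def two_proportion_ztest(pass_a, n_a, pass_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: equal pass rates."""
    p_a, p_b = pass_a / n_a, pass_b / n_b
    p_pool = (pass_a + pass_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * P(Z > |z|)
    return z, p_value

# Hypothetical example: 168 of 240 students pass with curriculum A,
# 131 of 235 pass with curriculum B.
z, p = two_proportion_ztest(168, 240, 131, 235)
print(f"pass rates {168/240:.1%} vs {131/235:.1%}:  z = {z:.2f}, p = {p:.4f}")
```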

Last Friday I and some other interested parties met with the head of curriculum and the head of science/math for the Seattle Public Schools.  I do appreciate that they were willing to hear us out.  But when we asked what plan they had for testing various math curricula and then proceeding with the most effective approaches, you would think we were from a different planet.  There were no plans to do such testing, only a dedication to the "system approach," which appears to mean arranging lots of tutoring and alternative classes when students run into trouble.  I would suggest it is better to keep them from getting into trouble in the first place.

[Figure: scientific-method flowchart]

Just maddening!  But the interesting thing is that some unofficial experiments with more traditional approaches to teaching (ones based on direct instruction, learning of foundational concepts, and practice to mastery) ARE occurring, and the results are stunning.

Some examples:

The Seattle Public Schools use reform/discovery math at all levels (my opinion...a disaster).  Schmitz Park Elementary got permission to try Singapore Math textbooks in 2007 (traditional direct instruction).  Its students’ math scores soared; in 2010 the 5th graders had the third highest passing rate in the state on the state test, even though the school has no gifted magnet program.  North Beach Elementary began using Saxon Math in 2001.  Their scores rose dramatically and stayed high for years, until a new principal, who opposed Saxon, took over; then the scores plummeted.  That principal was replaced, and the scores are back up.  At Ballard High, teacher Ted Nutting's students' scores on the AP calculus test have for several years averaged far higher than those of any other school in the district--guess what he and the Ballard precalculus teacher don't use?  Discovery/reform math.

Or how about Seattle high schools?  Here are the scores from the Algebra I end-of-course (EOC) assessments, in order of the percentage of students getting free lunch (a proxy for students' economic status).  Now you might expect scores to scale with socioeconomic status, assuming everyone got the same curricula, right?  And that is generally true, except for two schools: Franklin and Cleveland.  Franklin is the largest anomaly.  Well, folks, a little research has found that teachers at Franklin have generally put the district-provided reform math books away and have taught using more traditional/direct-instruction approaches...and the results are obvious.  Cleveland has double-length math classes.  Can you imagine if we had double-length math classes plus good curricula in all Seattle schools?

[Chart: Algebra I EOC pass rates for Seattle high schools, ordered by percentage of students receiving free lunch]

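One way to make the "scores scale with socioeconomic status" check concrete: fit a simple trend line of pass rate against free-lunch percentage and see which schools sit far off it.  The sketch below is in Python with made-up placeholder percentages (not the district's actual figures); it only illustrates the residual check that makes a school like Franklin stand out.

```python
# Flag schools whose pass rates sit far above or below a simple trend line
# of pass rate vs. free-lunch percentage.  All numbers are placeholders.
schools = {
    # school: (free-lunch %, algebra EOC pass %) -- hypothetical values
    "School A": (20.0, 70.0),
    "School B": (35.0, 55.0),
    "School C": (50.0, 40.0),
    "School D": (60.0, 58.0),  # deliberately sits well above the trend
    "School E": (70.0, 25.0),
}

xs = [fl for fl, _ in schools.values()]
ys = [pr for _, pr in schools.values()]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n

# Ordinary least-squares fit of pass_rate ~ free_lunch_pct.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

for name, (fl, pr) in schools.items():
    predicted = intercept + slope * fl
    residual = pr - predicted
    flag = "  <-- far off the trend" if abs(residual) > 10 else ""
    print(f"{name}: predicted {predicted:5.1f}%, actual {pr:5.1f}%{flag}")
```
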
What about other districts?  Consider Gildo Rey Elementary in Auburn, which over the last two years switched from Everyday Math (which Seattle uses) to Singapore Math -- Math in Focus (traditional texts).  The pass rates on the MSP Math Exam in 2011 for grade 5 jumped to astounding levels:

Low Income : Black : Limited English (pass rates)
47.2% : 39.3% : 23.2% : State
44.8% : 33.8% : 26.0% : Seattle
88.5% : n/a : 85.0% : Gildo Rey Elementary in Auburn

I could give you many more examples.  But the bottom line is clear: a large number of informal experiments have shown that direct-instruction approaches, with an emphasis on mastering basic facts and practice to mastery, produce far better outcomes for our students, and the educational bureaucracy doesn't seem to care.  Why?  Because the educational business is far more interested in theoretical ideas and "social justice" than in empirical proof.  And Schools of Education are more a part of the problem than of the solution.

It is really so sad.  Replacing the curriculum and books is relatively inexpensive and easy compared to most other changes and could be done quickly.  Don't get me wrong: poor student performance has a multiplicity of causes, including classes that are too large, overworked teachers, teachers without sufficient subject mastery, student poverty, lack of home support, and many more.  But curriculum improvement is low-hanging fruit that we should grab.  The money could be found for new books.  How many parents would be willing to contribute toward a new textbook to ensure their child had a chance for a future?  Or might the Gates Foundation contribute to textbooks instead of the valueless Teach for America boondoggle?  God knows the huge sums wasted by the Gates Foundation and Microsoft on ill-considered experiments that have generally done more harm than good.

In short, there is very strong evidence that a change of curriculum from reform/discovery/fuzzy math to direct-instruction approaches, with an emphasis on basic facts and mastery, could greatly improve the math performance of all students.  I challenge the educational establishment to do the robust experiments that will prove or disprove this statement.  Sadly, I suspect they won't.  That is why we need new school board members, in Seattle and in other districts, who will push for a more rational approach to curriculum acquisition.

23 comments:

  1. Clearly too obvious to be considered by those in "Curriculum Planning!"

  2. Thanks for taking the time to meet with SPS curriculum people. There are a few other ways to improve math right away. Singapore Math has been approved as a supplement to Everyday Math, but few teachers use it and I'm not sure there are enough books for all students. Teachers should be using it and sending the workbooks home with every elementary student. Also, we should replace MAP with ALEKS. ALEKS is an online learning program that continuously assesses individual progress and brings students to proficiency in the most cost-effective and efficient manner. It's the student placement test for WA state colleges and is used in over 600 schools and districts in WA as well as in many other states. It's a much better assessment tool than MAP, fills the gaps in our math curricula, and can be used by teachers to differentiate instruction for students who are years behind through those who are years ahead. And it's very well researched.

  3. In this scientific-method flowchart, we can never conclude that the original hypothesis might be wrong; we just keep retrying the experiments until they give the result we want. Forgive me if I'm being dense, but are you saying the school district is doing this kind of mock science?

  4. Keep up the good fight!

    My sister-in-law is an elementary school teacher in King County, and when I broach this topic she looks at me as if I had just grown a second head... sad

  5. I noticed a small omission in your chart, and wanted to bring it to your attention, since it is also part of the scientific method, a part which seems to be sadly lacking in a lot of "scientific" work today.

    The issue is around the "Faulty Experiments?" node -- what if the experiments were NOT faulty and the hypothesis was incorrect? In that case the experiments and observations don't -- and can't -- support the hypothesis, and you can then draw the conclusion and report that the hypothesis was incorrect. If you never assume that the hypothesis could be incorrect, it becomes easy to eventually design a set of experiments that will "prove" the faulty hypothesis.

  6. The debate on the merits of discovery vs. traditional math curricula goes back much further than the 1990s. A head-to-head comparison of the two approaches, with student achievement as the dependent variable, will not end the debate. Proponents of discovery methods argue that under traditional approaches, students' "understanding" of math concepts is rote and superficial and they don't truly learn "at a deep level." How one goes about proving this objectively and convincingly is another story. This, like many other debates in education, is an endless conundrum, because: 1) there is no consensus about how one objectively defines learning and achievement in math or other subjects, and 2) in practice, some curriculum approaches will always work better for certain students than for others.

  7. Sadly, my oldest is a victim of "integrated math." She has spent hard cash relearning Algebra II, trig, and pre-calculus at community college, trying to learn skills she should have had. In our district, only honors kids got "traditional" or direct instruction. I wrote to legislators earlier this year asking them to stop setting new fad standards and give the ones they have already set a chance to work. Our educational system is very frustrating.

  8. Cliff and others, you might be interested in an old but still-relevant article that I and a group of science-based educators published two decades ago to summarize what we called "rights to effective education" that empirical research about learning and teaching had illuminated. Here's the link: http://www.fluency.org/Right_to_eff_edu.pdf

  9. Along the lines of what Puffin addressed: how do we measure students' grasp/understanding of material? Critical to a scientific and 'learning' approach to education is improved feedback and observation. Of the many things I hear complained about in education, bad curriculum and standardized testing are two of the most frequently hit. I find myself agreeing with most of your statements and frustrated with education, but so much seems stymied in opinion and conjecture. Have we ever been good at education, and what is our final metric? Papers published? Success in college? Leaders? Followers? Innovators? I know from experience that I've worked with engineers who, you know, KNOW it, and ones in whom you lack confidence. Can we only judge this one on one? Then standardized testing falls apart. How comfortable are you basing theory on standardized tests, and is there a counter to your numbers (a different metric that benefits from discovery math)?

  10. Just want to clarify - I teach at Gildo Rey. We don't actually use Math in Focus in the tested grades; we use the parts that apply to us. But we do use direct instruction methods with intensity and data-driven interventions. No curriculum will solve the problem in math education. Good math TEACHERS and a direct instruction approach are what make the difference. Thanks for recognizing our hard work!

  11. Two of the educational institutions I hear most criticized are standardized testing and science/math curriculum. Is a standardized test the appropriate tool to measure which teaching methods are effective...and effective at what? Creating innovators? Lifelong learners? Leaders? Followers? I think for people who are hands-on in the world of math and science, it is obvious that something is wrong with our K-12. As an engineer I've worked with colleagues who KNOW their stuff and colleagues who have merely been to the classes. I support the approach of learning/studying/experimenting with how to teach and transfer our pool of knowledge. Having good observational tools is critical, and sometimes it seems like the only way to know is real-world tests (i.e., if they don't understand 'x', they won't be able to produce 'y') and a more qualitative one-on-one. How do we bring this into the equation, and how do you respond to people who say kids are getting a better 'understanding' of the material? I believe the real world currently supports the tests in saying kids are not understanding better, but I don't have any numbers.

  12. There actually has been a study on the efficacy of Direct Instruction. It's called Project Follow Through and was conducted from 1967 through 1995 in a variety of schools all over the US.
    http://pages.uoregon.edu/adiep/ft/grossen.htm
    The results are clear: DI is a far better method (especially for kids in poverty) than the "Discovery" method.
    And yet, we don't use the results. America keeps pushing bad methods and bad textbooks. It's almost as if the people at the top were bad at math and now they want the rest of us to be bad at it as well. Makes them feel better.

  13. At iCoachMath.com we promote Practice. The only way to have concepts ingrained in your brain is through Practice. You may help the child understand a concept, but how do you get them to retain this - PRACTICE PRACTICE PRACTICE...

  14. At iCoachMath.com we promote Practice. The only way to have concepts ingrained in your brain is through Practice. You may help the child understand a concept, but how do you get them to retain this - PRACTICE PRACTICE PRACTICE...

  15. Thank you for continuing to press the local schools on this issue. I am a product of the Integrated Math of the 1990s that was rampant in my large rural district in Pierce County, and later Thurston County, when I was in HS. I struggled in AP Calc when we used real college books, struggled in Math 124/5/6, and wound up in Econ. Not to say Econ was a bad major, but making charts to show the engineers are on budget is a far cry from actually being an engineer, which is what I wanted to be when I started at UW. I hope that other local students will have a better foundation than I did, so they won't have to doggie-paddle their way through the "weeder" prereqs at UW.

  16. Big bird migration last night, captured nicely on radar.

  17. Not to suggest that anybody's doing anything overtly wrong or corrupt, but I'd be reassured if school administrators published detailed diaries or some other record of their selection process when choosing texts, including interactions with publishers' representatives. A lot of money is spent on textbooks, between $75 and $120 per year per student in high school. I can imagine there's considerable steady pressure to buy new texts.

    Are the refresh rates on DI math texts lower than on D/D books? When I went to high school (yes, it was uphill both ways, freezing snow on the way there and scorching fire on the way back, every day) my geometry textbook included little in the way of dating clues; there were no comic strip reprints, no photos of children wearing recent fashions, etc. The layout could have been produced anywhere in a 20-year span. How about D/D texts? Is their shelf life shorter? Is this a subtle impetus to replace texts more rapidly? Other than presentational style, does anything in mathematics change at the secondary level sufficient to warrant frequent wholesale replacement of books? What's the impact on product cost from constant production refinement of what's essentially a static subject?

    Are the latest tweaks to "learning styles" simply the equivalent of new chrome on cars, planned obsolescence? In other words, is that 15th math textbook edition really necessary and how does pedagogical utility prioritize next to the need to preserve and build market share?

    Here's an assertion from our authorities on instructional materials that is quite remarkable when viewed through a mathematical lens:

    "The need for current, up-to-date instructional materials is paramount. Newer materials contain more accurate information and incorporate the most contemporary pedagogical approaches." (from Washington Learns instructional materials report )

    Really? Triangles are recently discovered to have four sides, not three? Who knew that mathematical facts are so fluid and transient?

  18. Hopefully someone will listen before it's too late. Until then, I guess we can scour overseas colleges to get our scientists, designers, and engineers.

  19. You suggest an experiment designed to test the outcomes of the various curricula. As a research scientist (for 28 years) frustrated with the outcomes of science education, I joined a group doing just that at a university in the northeast. It is not as easy as it seems. Our first experiment, despite supposedly matched samples, resulted in pretests that were much higher in one set of schools, with one curriculum, than in the other, with a different curriculum. And the post-test showed no change in either set of schools. This represented years of work, and this type of result is not atypical. Human beings are a difficult group to study, and educational researchers do try hard.
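
The pretest problem this commenter describes can be made concrete with a small sketch (the numbers below are made up): when one group starts far ahead, comparing post-test scores alone credits its curriculum with an advantage it already had, while comparing gains (post minus pre) does not.

```python
# Why unbalanced pretests mislead: compare post-test-only means with
# gain scores (post minus pre).  All scores are hypothetical.
group_a = {"pre": [62, 58, 65, 60], "post": [64, 61, 66, 63]}  # curriculum A
group_b = {"pre": [44, 48, 41, 46], "post": [47, 50, 44, 48]}  # curriculum B

def mean(xs):
    return sum(xs) / len(xs)

for name, g in [("A", group_a), ("B", group_b)]:
    gain = mean(g["post"]) - mean(g["pre"])
    print(f"group {name}: post-test mean {mean(g['post']):.1f}, gain {gain:.1f}")

# The post-test means differ by about 16 points, but the gains are nearly
# identical: a post-only comparison reflects the pretest imbalance, not the
# curricula.
```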

  20. Here are the results for students in each Seattle comprehensive high school who took an algebra class last year and then took the State's End-of-Course Algebra test.

    First is the pass rate for all students,
    followed by the same for low-income students,
    and then the percentage of low-income students scoring Well Below Basic.

    All :: Low Income :: Low Income Well Below Basic

    53.2% :: 42.0% :: 35.2% == Ballard
    46.1% :: 43.7% :: 38.2% == Cleveland
    30.4% :: 23.2% :: 49.3% == C. Sealth
    53.9% :: 56.6% :: 21.4% == Franklin
    30.7% :: 29.9% :: 47.7% == Garfield
    36.0% :: 17.9% :: 45.7% == Ingraham
    56.4% :: 31.7% :: 43.1% == Nathan Hale
    7.4% :: 8.5% :: 63.9% == Rainier Beach
    71.8% :: 59.4% :: 20.6% == Roosevelt
    35.5% :: 28.2% :: 45.0% == West Seattle

  21. There is a solution to the problem of math education. A former client, iLearn.com, has created software that is based on differentiated instruction. A series of pre-tests assesses what an individual student knows and doesn't know about math. As a student moves into the middle grades there may be serious gaps in math knowledge. The software can identify these gaps long before a human teacher can make such an assessment. The software delivers instruction to close these gaps. It is mastery based. The student doesn't move forward until each unit is mastered.

    Visit the website and look at the iPASS software demos and the results achieved in schools using the software. The lowest performing students actually catch up to grade level and even move ahead.

    I recruited a national organization to conduct the kind of research study Dr. Mass calls for. This organization has a goal of testing new ideas and curricula. Out of thousands of schools only three agreed to test the software and all three schools found reasons to abandon a scientific study.

    It is possible to demonstrate the dramatic results achieved by the few schools using this method. What I have found -- and so has the company -- is the massive indifference to success among our "professional" educators.

    I can arrange a go-to-meeting web demo for all who are interested.

    If educrats aren't interested, it can be used over the web by individual students if parents subscribe.

  22. Sadly, my daughter was a victim of Core Math, which is the same kind of "group therapy"-style instruction. She has had to take remedial math in college, although she could do double-digit multiplication in her head in the second grade (before the new math program was put in place). When all is said and done, I think the real reason districts go with these "touchy-feely" programs is to say they are doing something about improving math scores. Math book publishers push these programs to sell more books and related materials. After all, how much has changed in Algebra in the past 20 years? It's about the money: school district management wants to keep their jobs by seeming to do something about math scores, and publishers want more profit. Unfortunately, they make that money at the expense of the kids and of the society in which they will grow up to live and work.

  23. I think my daughter, who went to Issaquah and Bellevue schools (graduating in 2009), took math by the "discovery method," because it took her many years, even into junior and senior high, to know with any reliability the simple 12x12 multiplication table, something we mastered by 3rd or 4th grade back in the '60s.


