Starting in early March 2003, more than a dozen math education experts were copied on the email exchange between Dr. Pendred Noyce, M.D., head of the Noyce Foundation, and David Klein, Professor of Mathematics at California State University, Northridge. The first message below, addressed to Dr. Ira Papick, arose out of a discussion of the usefulness of mainstream mathematics education research, and was forwarded to Dr. Noyce.
The first message refers to a document entitled "Show-Me Project Brief: Research on the Impact of Standards-Based Middle Grades Mathematics Curricular Materials on Student Learning." It was created by the NSF-funded "Show-Me Project" in Missouri.
First Letter
Ira,
Your comment that our work should be research-based raises some difficulties. As Assistant Secretary of Education Grover Whitehurst said recently,
"The research in math is really in its infancy. What it provides in policy and practice is educated guesses."
That is a charitable way to put it. The problem with mainstream math education research is that it is of such low quality that it is unreliable. It typically takes the form of "advocacy research" for programs endorsed by the NCTM, and this feature causes the field of math education to more closely resemble a self-protective clique or sect than a viable field of inquiry. A few perspectives worth considering on this important issue are:
Why Education Experts Resist Effective Practices (And What It Would Take to Make Education More Like Medicine), by Douglas Carnine
http://www.scboces.k12.co.us/english/IMC/Focus/DirectInstruction-carnine_article.pdf
Mathematics Education Research, by Bastiaan J. Braams
http://www.math.nyu.edu/mfdd/braams/links/research0104.html
Can there be "research in mathematical education?", by Herbert Wilf
http://www.math.upenn.edu/~wilf/website/PSUTalk.pdf
The document that you attached to your message illustrates some of the shortcomings in math education research. The article exhibits the usual "throw-everything-in-the-pot" set of references assembled by people who write polemically but try to pass off what they are doing as a review of research. Anything that has something to say about mathematics education, even if it is not a real piece of research (many are essays, anecdotes, etc.), gets thrown in to beef up the appearance of research (all those authors and dates in parentheses).
When one looks at the studies cited as showing evidence of student achievement from something other than short-term, small-scale studies (which usually translates as ethnographic or case study research--not experimental/comparison), there are no more than five:
1) the Riordan and Noyce study (2001)
2) Reys and Reys (2003)
3) Beaton et al (1997)
4) Griffin et al (2000)
5) Briars (2001)
Studies 4 and 5 are not peer reviewed (for what little extra that is worth in this field), and there are serious shortcomings at least in #5 (I don't know about the others).
One of several shortcomings of item 1) is that the schools studied are not identified. That makes it impossible to verify the results independently, thereby raising the possibility of fraud. This is a realistic possibility, as the Noyce Foundation (headed by one of the authors of the study) has invested a lot of money in CMP, one of the programs found successful by the study. Clearly, that author has an interest in good results for the schools using the program she endorses. The editors of the journal should have asked for an independent confirmation of the methodology used to select both sets of schools and how they were matched before publishing the article. Instead, we get another example of "advocacy research." The comparison schools are constructed in a questionable way. The authors mix up all kinds of textbooks in the comparison groups--for half of which they report no curriculum program at all in the published article. No follow-up studies have ever appeared showing whether these schools maintained improvement and continued to improve in subsequent years of MCAS (2000, 2001, 2002), which would be easy to do since there were only about 20 or so schools in the experimental group.
The most telling evidence against the Riordan/Noyce study, however, is the fact that despite growing use of NCTM endorsed math programs (financed by millions of dollars from the NSF), the percentages of kids in the top two performance categories on the grade 4 and grade 8 MCAS have been stable since 1998. For 5 years, there has been no discernible increase in the percentage of kids moving into the two top categories, based on a test that matches the NCTM reform agenda.
Let us hope that the quality of math ed research will increase sufficiently so as to actually help improve math education, rather than harm it, as it often does now.
David Klein
Dear Professor Klein:
Several people have forwarded to me copies of a message you sent out to a group of mathematicians on Wednesday, Feb 26. Your message raises questions about the study Julie Riordan and I published in JRME in July 2001. Although you have never addressed these questions to me, my co-author, nor, to my knowledge, the editors of JRME, the issues you raise seem to me serious enough to warrant a public reply.
I wanted to alert you to the fact that your critique begins with an erroneous statement. The Noyce Foundation has never "invested" in CMP. We have not funded its development nor its distributors, and obviously we derive no financial benefit from its sales. In 1999, responding to district requests, we did begin to support the work of the former state mathematics coordinator as she provided content courses and mentoring to teachers in districts newly implementing CMP. As we did so, we became concerned at the paucity of well-controlled studies of the curriculum's effectiveness. We wished to ensure that we devote the foundation's resources to helping school districts implement effective programs, and it seemed to us that enough time had passed since development of the materials that such a study would now be possible. Finding that nobody else was undertaking that work, we did so ourselves. We thought that the unsubstantiated rhetoric on both sides was a major disservice to schools and communities, and we would have sought to publish and act on the results no matter what the outcome. Only after finding that CMP had a moderate positive effect on student achievement did we establish our own program to provide mathematics coaches for under-performing school districts that had decided, on their own, to try this new curriculum. None of the schools that eventually received Noyce Foundation assistance figure in our study, because none of them had been implementing the program before 1999.
You raise concerns about the construction of our comparison groups. We did not provide head-to-head comparisons of CMP or Everyday Math against other single defined curricula--neatly matched schools meeting such a criterion did not exist in our Massachusetts sample. Instead, we compared schools using the new programs to schools (carefully matched on demographics and past test scores) that were using the mix of curricula in most common use in Massachusetts at the time. We sought only to exclude other schools implementing what you call "NCTM endorsed math programs." As is common in American schools, and as discussed in our article, the comparison schools were using a bewildering array of texts and locally-developed materials.
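[For readers unfamiliar with matched-comparison designs, the pairing Dr. Noyce describes can be illustrated with a minimal sketch in Python. The field names, weights, and distance measure below are hypothetical stand-ins for illustration only; they are not the procedure actually used in the Riordan/Noyce study.

    # Minimal sketch of matching comparison schools on demographics and
    # prior test scores. Field names and the distance measure are
    # hypothetical; the general idea is nearest-neighbor matching.

    def distance(a, b):
        # Unweighted absolute differences on the matching variables
        # (a real study would choose and justify weights).
        return (abs(a["pct_low_income"] - b["pct_low_income"])
                + abs(a["prior_mean_score"] - b["prior_mean_score"]))

    def match_schools(program_schools, candidate_pool):
        """Pair each program school with its closest unused comparison school."""
        pool = list(candidate_pool)
        matches = {}
        for school in program_schools:
            best = min(pool, key=lambda c: distance(school, c))
            matches[school["name"]] = best["name"]
            pool.remove(best)  # match without replacement
        return matches

    program = [{"name": "A", "pct_low_income": 12.0, "prior_mean_score": 242.0}]
    pool = [{"name": "X", "pct_low_income": 14.0, "prior_mean_score": 240.0},
            {"name": "Y", "pct_low_income": 40.0, "prior_mean_score": 220.0}]
    print(match_schools(program, pool))  # {'A': 'X'}

Matching without replacement, as sketched here, uses each comparison school at most once; a real design must also decide how close a match has to be before a program school is dropped for lack of one.]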
You express concern that our article did not identify the individual schools studied. I find your objection puzzling. Confidentiality for subjects is a staple of the medical literature and is commonly required by human subjects review committees in the social sciences. I do not recall often seeing individual schools identified in the education literature. Perhaps you would care to share with the group some examples of studies you regard as exemplary in this regard. At the time we initiated the study, many school personnel expressed reluctance to participate, perhaps fearing controversy. To ensure a full sample we assured participating schools that results would be reported only in the aggregate. Surely our decision to honor that commitment does not warrant an accusation or suggestion of fraud. We are happy to share our primary data with school identifiers removed, and we have also provided anyone who asked with a detailed description of the methodology and sources we used to identify all the schools in the state implementing the curricula, as well as match schools not using them. It would be a relatively simple matter for another researcher to reproduce our list of implementers. By the way, I have had our files searched, and can find no record of a request from you for additional information about the study. Is it possible that a request went astray? Please do not hesitate to let us know.
You express interest in follow-up data. We have the same interest, and next month at AERA we will be reporting a follow-up study on the Massachusetts CMP schools, the schools' performance on subsequent years of the MCAS test, and their students' performance in high school mathematics. As an extension of our earlier work, addressing new questions, this study should prove to be of even broader interest than the regular update on school scores that you suggest.
Debates about methodology, choice of match groups, follow-up, potential confounding factors, and significance and applicability of findings are appropriate to scholarly discourse. All researchers face limitations that come with studying change in the complex and multi-leveled world of schools. I welcome thoughtful critiques of our work, although I prefer to receive them directly rather than second hand; answering them often helps clarify my own thinking. Ideally such discussions should lead to more and better studies being done. Indeed, they appear to have led to a new interest in funding and supporting prospective randomized studies, which should prove very informative.
But suggestions of scientific fraud can do real damage to reputations, and they typically lower the quality of debate about important issues. Moreover, reckless aspersions tend to obscure a position I think we hold in common: we need high quality mathematics education research, and lots of it. Let's ask ourselves and our colleagues to take the high road in exploring the issues, because the stakes--our children's and our nation's future--are so important to us all.
Sincerely yours,
Pendred E. Noyce, MD
Dear Dr. Noyce,
Thanks for your interesting letter in reply to my criticisms of your paper:
Riordan, J. E., & Noyce, P. E. (2001). The impact of two standards-based mathematics curricula on student achievement in Massachusetts. Journal for Research in Mathematics Education, 32(4), 368-398.
The cc's on this letter include some people on the original list of your message to me.
Let me begin by responding to your question:
"By the way, I have had our files searched, and can find no record of a request from you for additional information about the study. Is it possible that a request went astray? Please do not hesitate to let us know."I have never sent a request to you for additional information about your study. Once a
In my statement, I wrote, "the Noyce Foundation (headed by one of the authors of the study) has invested a lot of money in CMP, one of the programs found successful by the study. Clearly, that author has an interest in good results for the schools using the program she endorses." In response you point out that:
"The Noyce Foundation has never 'invested' in CMP. We have not funded its development nor its distributors, and obviously we derive no financial benefit from its sales."I did not mean to imply that the Noyce Foundation made a financial investment in
Regarding concealment of the identity of schools you studied, you wrote:
"You express concern that our article did not identify the individual schools studied. I find your objection puzzling. Confidentiality for subjects is a staple of the medical literature and is commonly required by human subjects review committees in the social sciences. I do not recall often seeing individual schools identified in the education literature."A standard practice among medieval physicians was to bleed patients. Standard practice is not necessarily good practice. The standard practice among current education researchers of concealing the identities of putatively successful schools using controversial curricula is a practice that should be abandoned. It is one of many features of education research that makes the findings unreliable. I regard the ethical argument about confidentiality to be nothing more than a shell game. Yes, individual students must be protected, without a doubt. But test scoresfor entire schools, and even grade levels within those schools, are a matter of public record. In Massachusetts, where you did your study, school scores are printed in the Boston Globe and widely publicized. There is nothing confidential about school test scores, except among education researchers.
Concealing the identities of successful schools that use NCTM style curricula serves only to conceal questionable research practices, including possible fraud. Even experimental physics is not immune from fraudulent research studies, but that field has far better checks and balances than education.
In the case of your study, how do we know that the EM and CMP schools selected by CESAME and the publishers were not "cherry picked" in the sense that a handful of low scoring schools using EM or CMP were discreetly dropped from comprehensive lists? Could that affect the findings of your study? Both CESAME (because of NSF grants) and the publishers have financial stakes in positive findings for EM and CMP.
To your credit, you acknowledge in your paper that the target schools using EM and CMP were predominantly high income and White. Given that, how do we rule out the possibility that the introduction of these curricula caused a significant increase in outside tutoring? The tutoring industry nationwide has skyrocketed with the introduction of NCTM math programs. Could it be that scores have improved in the high income schools you studied, not because of EM and CMP, but in spite of EM and CMP, on account of a drastic increase in private tutoring or after school support? Such questions are conveniently out of range to outside investigators because of "confidentiality." Promoters of NCTM curricula such as EM seem to favor implementing their programs in high income areas. Could it be for this reason? If you revealed the names of the schools in your studies, issues like these could be investigated in an open way. So why not clear the air?
I am regularly bombarded by email messages with pleas for help from parents who are desperate to fill in the gaps left by NCTM math programs. Both EM and CMP are major sources of complaints. Parents often resort to private tutoring. Two years ago, I was asked to give a talk about math programs to a parents' group in a high income region in the L.A. area called La Canada. The school district had extremely high test scores, and I was surprised not only by the request for me to speak, but also by the urgency of the request. I asked the parents what they were so worried about, given their top scoring schools. The answer from the approximately 50 parents I spoke to was that they were paying through the nose for tutoring. That was why their scores were so high. The tutors were using programs like Saxon math for elementary school to compensate for the NCTM endorsed programs that La Canada schools were using. Similarly, in Palo Alto, following the introduction of NCTM aligned pedagogy in 1994, Bill Evers reports that:
"Palo Alto School
District
parents are sufficiently discontented with the district's
math performance that in
massive numbers they are resorting to outside math
tutoring programs.
Forty-eight
percent of parents report providing outside help in
math for their children
(in the middle schools, this number rises to 63 percent).
The math-basics group HOLD's
own informal survey of the best-known
commercial math programs
shows that Palo Alto parents are spending at least $1
million a year for math
tutoring." http://www.csun.edu/~vcmth00m/AHistory.html
Let me now turn to your comparison groups. The comparison schools in your study used 15 different elementary school textbook programs and 15 different middle school textbook programs. Some schools were using district-designed programs, and others were using various combinations, possibly differing from one grade level to the next. At the middle school level, about half the schools were using programs published by Heath, Houghton Mifflin, Addison-Wesley, and Prentice Hall.
Without knowing titles and authors etc., I'm not sure which books these are. But I happen to have a copy of Prentice Hall's "Middle Grades Mathematics: An Interactive Approach," Course 3 by Chapin et al, copyright 1995. Is this the Prentice Hall book referred to in your study? It is a book that I would wish on no one. Far from "traditional," it appears to be heavily influenced by the NCTM. The only traditional features are the hard cover binding and the existence of some exercises. There are small group projects, calculator use is encouraged, and each chapter begins with an "investigation," e.g., chapter 4 begins with "Making Mobiles," and this is followed by a "think and discuss" section that introduces algebra tiles for adding linear expressions of the form ax + b. The book is full of pointless color photographs, much like a web page. The effect is to distract the reader and discourage focused attention. The first chapter contains a "zoo" of pie charts, bar graphs, line graphs, box-and-whisker plots, and other favored topics of the math reform movement. In short, it is a book that embraces the NCTM agenda. If this is typical of what the comparison schools were/are using in MA, then what you are measuring is NCTM vs NCTM. The inevitable result is that the NCTM wins the race, but it is a snail race between defective programs.
Your findings strain credibility in other ways. According to the test scores you report, CMP results in superior performance in every category on the 1999 MCAS but one, which you call "short answer." Now recall that the first edition of CMP (the one in use for your study) was radically deficient in its treatment of fraction arithmetic, and had absolutely no material (literally NOTHING) on division of fractions for any of the grades 6-8. Is fraction arithmetic tested on the MCAS? Does the MCAS ever ask students to divide one fraction by another (without calculator assistance)? If the answer is no, then you have a serious problem in that your NCTM aligned test is radically defective. If the answer is yes, then we are forced to wonder how CMP students could learn to solve fraction arithmetic problems on tests better than their peers, with almost nothing in CMP to support this achievement. Is fraction arithmetic also missing from the textbooks in the comparison schools? Or are CMP students learning it from private tutors? What is really going on?
Another question for you: Do you claim that all 21 CMP schools in your study do not teach separate Algebra I courses in 8th grade? My understanding is that Algebra I is taught at least to some 8th grade students at almost all of the middle schools in the state, and surely at the high income schools. Are your 21 CMP schools exceptions? If not, how are the Algebra I student scores factored into your study, if at all?
Sincerely,
David Klein
March 25, 2003
Dear Professor Klein:
I recently received your letter of March 7. In reply, I would like to answer some of the particular questions you raise in your most recent letter, and then address the concern that appears to underlie both the letters I have read.
You ask how we are to know that the schools "selected" by the publishers and CESAME were not "cherry-picked" to exclude low performing schools. In fact, CESAME and the publisher did not select schools. We obtained lists from CESAME and the publishers of ALL schools that were known to be implementing the curriculum or had purchased materials. We cross-checked the list against a 1999 statewide curriculum survey. We then contacted the schools to determine whether they met the criteria for inclusion--which in the case of CMP meant that they had implemented at least 11 CMP units by 1998-1999. We were committed to including all the schools that were implementing the materials as their core program, precisely because we were concerned that studies of selected schools might present only a best-case scenario.
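[The assembly procedure Dr. Noyce describes--pooling all known implementers, cross-checking against a statewide survey, then applying an inclusion criterion--can be expressed as a minimal sketch. The school names and unit counts below are invented placeholders, not data from the study.

    # Minimal sketch of the list-assembly step described above: union of
    # implementer lists, cross-checked against a survey, then filtered
    # by an inclusion criterion. All values here are hypothetical.

    cesame_list = {"School A", "School B"}
    publisher_list = {"School B", "School C"}
    survey_1999 = {"School A": 13, "School B": 11, "School C": 8}  # CMP units used

    candidates = cesame_list | publisher_list          # ALL known implementers
    confirmed = {s for s in candidates if s in survey_1999}

    # Inclusion criterion: at least 11 CMP units implemented by 1998-1999.
    included = sorted(s for s in confirmed if survey_1999[s] >= 11)
    print(included)  # ['School A', 'School B']

The point of starting from the union of all sources, rather than from any one list, is that no single party gets to choose which schools enter the study.]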
You ask how we rule out the possibility that the improved performance we observed came about "not because of EM and CMP, but in spite of EM and CMP, on account of a drastic increase in private tutoring or after school support?" We cannot rule out the possibility of increased tutoring with the data available to us. You relate an anecdote about a high level of tutoring in the La Canada region. It is much the same in my children's home district, Weston, MA. We have a high-scoring school district with a pretty traditional curriculum, where virtually all 8th graders take algebra. Large numbers of students either attend after-school programs like Kumon math or receive private tutoring. I am told that the going rate for tutors in our town is $80 an hour. Such anecdotes are interesting and may lead us to formulate hypotheses, but they are not by themselves otherwise useful. The parents attending the La Canada meeting you mention may not be representative, nor do we know the use of tutoring in that district in earlier years, or in similar districts using different materials.
However, your observations could lead you to formulate a hypothesis: that the performance gains seen in districts implementing CMP, EM, and other standards-based programs, or, for that matter, any curriculum, including Saxon, may be associated with an increase in tutoring. This is a testable hypothesis, and I would encourage you or others to carry out a study to test its validity. The major difficulty would be getting reliable and comparable data on tutoring rates. Probably the best way to do it would be to find a region (state or district) that routinely surveys students about tutoring or after-school courses. Massachusetts does not currently do so.
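[As a rough illustration of the kind of study suggested here, survey counts of tutored and non-tutored students in the two groups of districts could be compared with a standard contingency-table test. The counts below are invented placeholders; a real analysis would also have to address sampling and confounding.

    # Minimal sketch: compare the proportion of tutored students in
    # program districts vs. comparison districts. Counts are invented.

    from scipy.stats import chi2_contingency

    #                 tutored  not tutored
    table = [[180, 820],    # hypothetical program-district sample
             [140, 860]]    # hypothetical comparison-district sample

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
    # A small p-value would suggest tutoring rates differ between the
    # groups, which would need to be accounted for before crediting
    # any achievement difference to the curriculum.]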
You ask whether the MCAS includes questions on fraction arithmetic. On the 1999 test five of the forty questions were primarily about fractions, but there were no questions about division of fractions.
You ask how we factored Algebra I students into our study. We included in our analysis all regular education students who had been in the district for at least three years, without separating them by course title. In 1999 all the CMP districts reported using CMP as the core middle school curriculum for all students, and CMP units were used in every grade. We subsequently learned that some students in 4 of the 21 schools were placed into courses called Algebra I, where in some cases CMP units were supplemented with other materials. Nevertheless, these students met the criteria for inclusion--they had used at least 11 units of CMP as their core middle school curriculum. The match schools, too, had numbers of their higher-performing students enrolled in algebra courses.
In our follow-up study we are looking more closely at how the performance of students in CMP compares to that of students taking Algebra I, pre-algebra, or other courses. We also follow the subsequent course-taking career of the 1999 8th grade students in the CMP districts to investigate whether CMP provides an adequate preparation for higher level courses. At AERA my colleagues and I will also present a separate paper about grade 8 algebra across Massachusetts--who takes it, what its characteristics are, and how schools and students fare on MCAS as algebra enrollment increases.
Having attempted to answer your questions, I would like to address the key issue that I think underlies your comments. This is the issue of possible bias in our work. You state,
"I did not mean to imply that the Noyce Foundation made a financial investment in CMP as a business venture; rather, I was suggesting that the previous funding of CMP programs by the Noyce Foundation naturally leads the author of the study to have an interest in good results for the schools using the funded program. If you or others took my meaning otherwise, I apologize for that. One of the many deficiencies of education research is that studies are almost invariably conducted by what might be described as 'cheerleaders' for the target programs under study. Your study is no exception."
Here you seem to be raising the issue of unconscious bias and its detrimental effect on results. Such bias is something we all have to guard against, and is one of the reasons that replication of research by other scholars is important. However, later in your letter, you again make the suggestion that what is really going on is fraud:
"Concealing the identities of successful schools that use NCTM style curricula serves only to conceal questionable research practices, including possible fraud."
Fraud is a different matter. If you are accusing me of intentionally distorting results, fabricating evidence, or lying--or indeed some combination of these activities--then it seems to me that further communication between us will not be fruitful. For example, if I chose to ignore my promise to schools and give you their names, what would keep you from suggesting that I was still lying, or that I had suppressed data from additional schools, or that I had carefully constructed our list of match schools to give the CMP schools an advantage?
At a certain point continuing to debate the methodology of a piece of research is less useful than an independent effort to reproduce the findings in either the same or a different setting. Of course, that takes an investment of time and effort and carries with it the risk that one will find what one does not expect. Building is always more difficult than tearing down. In view of that difficulty, and the importance of independent confirmation of findings, I was gratified to see that Reys and Reys (2003), looking at schools in Missouri, came to conclusions that were similar to ours.
Sincerely,
Penny Noyce
April 4, 2003
Dear Dr. Noyce,
In order to avoid a possible misunderstanding, let me first address your comments about the following paragraph that I wrote in my previous letter to you:
"Concealing the identities of successful schools
that use NCTM style curricula
serves only to conceal questionable research
practices,
including possible fraud.
Even experimental physics is not immune from
fraudulent
research studies, but that
field has far better checks and balances than
education."
I believe that you drew the mistaken conclusion that I was accusing you personally of fraud. You wrote, "If you are accusing me of intentionally distorting results, fabricating evidence, or lying--or indeed some combination of these activities--then it seems to me that further communication between us will not be fruitful." Let me be clear. I am not accusing you of fraud, nor am I declaring that your article is free from it. Rather, I am indicting the standard educational research practice of concealing the names of schools involved in research. Without that critical information, independent verification is impossible, and, in general, questionable research practices, including fraud, cannot be ruled out. This is of particular concern when the researchers themselves are strong advocates of the programs they are studying.
My skepticism about education research, and the skepticism of many others, does not originate in a vacuum. Education research has a long history of dubious findings and questionable practices. For example, there are literally thousands of education research publications that validate whole language learning for the teaching of reading (for some citations and further information, see the published article "Sixty Years of Reading Research--But Who's Listening?" by Steve Zemelman, Harvey Daniels, and Marilyn Bizar on the Phi Delta Kappan web site at http://www.pdkintl.org/kappan/kzem9903.htm). This kind of "research" has contributed to serious shortcomings in the teaching of reading in the early grades.
You have not yet offered any justification for the practice of concealing the names of schools found to be successful by education studies, including yours, other than to say that anonymity is standard practice. Given the nearly universal positive findings in education research journals for NCTM-endorsed and NCTM-style programs, why should schools even be offered anonymity? They don't need it. They are virtually guaranteed praiseworthy findings from the investigations that will be accepted for publication in education journals. But even if the results of education research were not de facto pre-ordained, and there really was some variation in research findings, shouldn't school administrators take responsibility for the choices of curricula to which they subject their students? Whether you agree with that kind of accountability or not, the fact that in Massachusetts, where you did your study, school scores are published in the Boston Globe and are widely publicized, eliminates the argument that secrecy somehow protects students and teachers. The only point of this kind of secrecy is to protect "advocacy research" from the threat of transparency and from critical scrutiny. I urge you to abandon this practice in future studies you might undertake. If your findings really are based on reliable evidence, and your conclusions are sound, this should pose no difficulty for you.
I have some other suggestions on how you could have improved your article. The testing instrument, the MCAS, is largely controlled by the advocates of the NCTM vision of math education. Near the end of your article, you acknowledge, with approval, the strong correlation between the MCAS and NCTM style programs, and a brief comment in the appendix gives some statistical evidence that the MCAS favors the NCTM style programs. But you make no attempt to identify flaws of the MCAS and how they should affect interpretations of your findings. In contrast to your article, your letter of March 25 below forthrightly and commendably acknowledges that of the five questions on fraction arithmetic on the 8th grade 1999 MCAS, none required fraction division. This deficit of the MCAS exactly matched a deficit in CMP (whose first edition completely omitted division of fractions and offered only a weak treatment of the other operations). What are some other deficits of the MCAS? Are formula sheets provided with the MCAS so that students don't have to know simple, important formulas by heart? Does the MCAS allow the use of calculators? If so, how does that affect your ability to measure computational proficiency, an outcome of mathematics education that is almost universally embraced by mathematicians and others who use mathematics? Shouldn't a discussion like this be included in your section, "Limitations"? An objective researcher would make a stronger attempt to identify weaknesses in the testing instrument as a caveat to the findings.
Regarding Algebra I courses, you wrote:
"In 1999 all the CMP districts reported using CMP
as the core middle school
curriculum for all students, and CMP units were
used in every grade. We
subsequently learned that some students in 4 of
the 21 schools were placed into
courses called Algebra I, where in some cases CMP
units were supplemented with
other materials. Nevertheless, these students
met the criteria for inclusion-- they
had used at least 11 units of CMP as their core
middle school curriculum. The
match schools, too, had numbers of their higher
performing students enrolled in
algebra courses."
You state that you subsequently learned that some students in 4 of the 21 schools were placed into Algebra I courses. Are you claiming that there were *only* 4 schools that did this, or is it possible that some of the remaining 17 schools also did this but did not report it to you? High income schools tend to have high attendance in Algebra I in grade 8, and your CMP group is high income, so one might expect that more than 4 of the 21 schools offered Algebra I courses in 8th grade. This is one of many examples where the practice of concealing the names of schools is an impediment to scrutiny and verification of your results. At any rate, Algebra I is "traditional," so shouldn't the students in Algebra I in the 4 schools be included in the non-CMP group, rather than the CMP group? Were there comparable numbers of students in the algebra courses in the control schools and the CMP schools? The fact that some students in the CMP group were enrolled in Algebra I courses was not directly discussed in your article.
Regarding my comments on the effect that tutoring could have had on scores of the high income schools you studied that used EM or CMP, you wrote:
"It is much the same in my children's home
district,
Weston, MA. We have a
high-scoring school district with a pretty
traditional
curriculum, where virtually all
8th graders take algebra. Large numbers of students
either attend after-school
programs like Kumon math or receive private
tutoring.
I am told that the going
rate for tutors in our town is $80 an hour.
Such anecdotes are interesting and may
lead us to formulate hypotheses, but they are not
by themselves otherwise useful."
What is the "pretty traditional curriculum" for 8th graders, and what curriculum is used in the feeder elementary schools, and how long have those programs been used? This would be interesting to know. One way to minimize the effects of out of school tutoring on your data is to study schools in low income districts where few parents can afford tutoring. California has some impressive data for programs used in low income districts that are not NCTM style. Unlike typical studies that validate NCTM type programs, school names are not hidden on California data sets.
The Noyce Foundation is wealthy and powerful. It has the capability to retard or even harm K-12 mathematics education significantly. I urge you to break out of that pattern. Think twice about funding programs that a large number of mathematicians hold in low regard. Find at least one mathematician to consult who is not a "true believer" in NCTM style programs. Find a mathematician who is well informed about the flaws and deficiencies of mathematics programs that you find attractive before you fund those programs. In Massachusetts, Wilfried Schmid (cc'd) could give you excellent advice on this matter.
Sincerely,
David Klein