Public Comments on the CMF
Brian Conrad, Professor of Mathematics
Director of Undergraduate Studies in Math, Stanford University
(Posts #1-#8 are on the CMF version 2, posts #9-#10 are on the July 2023 Board-approved version 3, post #11 is on the October 2023 revision, post #12 is on a formal complaint making many false claims about me and this website)
Background
Introduction. High school students and parents deserve transparent, honest information about the math skills required to earn degrees in quantitative and STEM fields -- including in data science. That information cannot be replaced by gimmickry: courses that suffice for college admission but leave students mathematically unprepared for their desired goals at a higher-education institution.
The California Math Framework (CMF) must be transparent and accurate in the guidance it provides on college preparation. As Director of Undergraduate Studies for Math at Stanford since 2013, I felt a responsibility to look into these matters. (The picture at the top illustrates the Singular Value Decomposition, a fundamental point of contact between data science, algebra, and calculus.)
On this website I am posting most of my "public comment" submissions to the CMF revision process, based on a complete reading of both drafts. The detailed linked documents include substantial concerns alongside less significant ones because I am doing this entirely by myself and it is exhausting enough to document everything for submission. Hopefully the executive summary for each post gives a more approachable overview of it. Here are links (in reverse order of the postings below) to my detailed comments on number sense and K-8 guidance, avoiding acceleration via wishful thinking, equity, technology, assessment and problematic placement, high school pathways, data science in the CMF, and citation misrepresentation in the CMF.
See also the related blog posts 1 2 3, an open letter on K-12 mathematics, and a statement on mathematics in data science.
[...]
History. I first heard of the CMF in Summer 2021, and was told it proposed to integrate data science into the high school math curriculum. I knew how the two subjects interact, since I was part of a group of Math faculty that spoke with colleagues in many fields to overhaul Stanford's largest intro-level math class some years ago, aiming to make the modern applicability of fundamental math more transparent. That was a big success. The new course text we wrote was embedded into the website of the biggest machine learning course here, and one of the co-authors promptly won a Fields Medal (though, oddly, his citation didn't mention the new course text).
I read the entire Summer 2021 CMF draft to see how the synthesis would work at the high school level. Unfortunately, I saw that rather than developing data science in depth, the CMF promotes "data science" courses as a detour around algebra. But college degrees in statistics and data science at public universities in California require the missing high school algebra and also require college-level calculus and linear algebra.
Even UC Berkeley's very popular introductory Data 8 course with almost no prerequisites requires the missing high school algebra. Why promote a "data science" route that reduces preparedness for data science degrees, without clearly stating what has been avoided but will have to be learned later?
Over the next several months, I found more puzzles:
In California high schools, data science courses that teach no new algebra from the state content standards are widely offered yet are approved by the University of California (UC) as "validating" the Algebra II admissions requirement. Such validation is bogus, and UC's own regulations (see items 424.A.3(c), esp. 428C) and policies (see the three items at the bottom of page 1) seem to forbid it.
The missing algebra implies that such data science courses (followed, under common guidance, by a statistics class) leave students unprepared even for a degree in data science (including at UCLA, whose Statistics department is the home of the most widely-taught high school data science course).
UC area-C loophole. Eventually, I figured out the gimmick. AP Statistics requires Algebra II as a genuine prerequisite, so AP Statistics has been UC-approved to validate knowledge of Algebra II. Because of that, the UC admissions office (see slide 9 here) wrongly allows anything called Statistics -- regardless of its level of algebra content -- to "validate" knowledge of Algebra II. This is even the case when such courses have no Algebra II content, nor Algebra II as a genuine prerequisite. So the floodgates open: courses whose math content is only statistical are approved to "validate" Algebra II. This is a gross distortion of the purpose of validation.
The UC area-C loophole is now used by the CMF to justify its data science advocacy circumventing Algebra II. Then, in a feat of circular reasoning, the CMF (despite not yet being approved!) is cited by the UC admissions office to justify its own data science advocacy circumventing Algebra II (see slide 13 here).
Students, parents, teachers, and policy-makers must be told the truth: 4-year college degrees in data science, economics, statistics, computer science, and other STEM fields require core Algebra II skills to be learned. Such material is not taught at UCs, and occupies crucial course time early on at many other colleges (making it very difficult to complete the degree in 4 years; e.g., requiring summer courses that interfere with internships).
Public Comment #1: Key Math Ideas Promoting Success in Quantitative Fields
High School Math Topic Guidance
In quantitative and STEM fields (including economics, data science, and computer science), it remains important to be able to: work fluently with variables, reason from precise hypotheses, and use functions represented in many ways. The proliferation of new math courses in high schools risks confusing students, parents, and teachers about the math skills needed for contemporary college degrees.
[...]
The first draft of the CMF did not provide clarity about this, so Rafe Mazzeo (the Stanford Math dept. chairman) and I approached Stanford colleagues across all STEM fields (including data science & statistics) to collect the essential math skills (prior to calculus) for readiness across quantitative college majors. There was a remarkable degree of similarity in the responses we received. In collaboration with Patrick Callahan, we organized that feedback into a document submitted for public comment in early Fall 2021.
Our submission appears in the second draft of the CMF, at the end of that 900+ page document (rather than in Chapter 8 on "High School"). Our Introduction, on the purpose of high school math education and its relation to data science, was replaced with part of a streamlined introduction written for posting on the Stanford Math Department website. The original submission is linked above, and I hope its introduction is useful too.
Public Comment #2: Citation Misrepresentation
When I read the new CMF posted in mid-March, I encountered a lot of assertions that were hard to believe and were justified via citations to other papers. So I read those other papers. To my astonishment, in essentially all cases, the papers were seriously misrepresented in the CMF. Some papers even had conclusions opposite to what was said in the CMF.
A detailed discussion of the many misrepresented citations is in this linked document. Below I provide a version of the "executive summary" at the start of that document.
[...]
Executive Summary
The current draft of the CMF is a 900+ page document that is the outcome of an 11-month revision by a 5-person writing team supervised by a 20-person oversight team. As a hefty document with a large number of citations, it gives the impression of being a well-researched and evidence-based proposal. Unfortunately, this impression is incorrect.
I read the entire CMF, as well as many of the papers cited within it. The CMF contains false or misleading descriptions of many citations from the literature in neuroscience, acceleration, de-tracking, assessments, and more. (I consulted with three experts in neuroscience about the papers in that field which seemed to be used in the CMF in a concerning way.) Sometimes the original papers arrive at conclusions opposite those claimed in the CMF.
Just to be clear: I am not questioning the value or correctness of any of the cited papers. Rather, I am pointing out how such papers are invoked in the CMF for conclusions far removed from the papers themselves.
The 7 areas of concern (corresponding to sections of my more detailed document on this matter) are:
Neuroscience via Pseudoscience
Devaluing of Advanced High School Math
Myth from 1892
Misrepresentations About Acceleration
Misrepresentations About Tracking
Misrepresentations About Assessment
Miscellaneous Further Misrepresentations
Showing a citation is falsely described in the CMF does not mean a specific position is being advocated with respect to that citation. (There are other statements on content-based concerns with the CMF.) My purpose here is solely to affirm that honesty and accuracy matter in public education policy. I found all misrepresentations and related inaccuracies when reading the CMF (and cited papers) on my own. I thoroughly checked everything in what I have written, but since I am one person duplicating on my own the work of a 20-person oversight team I know mistakes are possible; if you find any, please tell me and I'll gladly fix them.
Here are some examples of problematic citations to the literature (for details see the linked document; all documents cited below are listed in Appendix B of the CMF, its overall bibliography organized by chapter):
(i). The CMF contains many misrepresentations of the literature on neuroscience, and statements betraying a lack of understanding of it. For example, the CMF claims that the "highest achieving people have more interconnected brains" and that creating "brain connections" improves learning. But the relation between brain connectivity and performance, especially in mathematics, is much more complex. In particular, children with developmental dyscalculia exhibit abnormal hyperconnectivity in several parts of their brain, contradicting the CMF's claims.
A sample misleading quote is "Park and Brannon (2013) found that when students worked with numbers and also saw the numbers as visual objects, brain communication was enhanced and student achievement increased." This single sentence contains multiple wrong statements: (1) they worked with adults and not students; (2) their experiments involved no brain imaging, and so could not demonstrate brain communication; (3) the paper does not claim that participants saw numbers as visual objects: their focus was on training the approximate number system.
Many more examples are discussed in Section 1 in the linked document. The nature of the errors implies (as explained in the linked document) that whoever wrote these parts of the CMF lacks an understanding of the neuroscience literature regarding the learning of mathematics. Nobody on the CMF writing team (nor the CFCC oversight team) has expertise in neuroscience, so it is understandable that they do not know this field. They should not be citing papers they do not understand to justify their public policy recommendations and guidance to districts. The neuroscience experts I talked with agree with a 2008 statement of the National Mathematics Advisory Panel: "attempts to make these connections [of neuroscience] to the classroom are premature".
(ii). The CMF selectively cites research to support the points it wants to make. For example, Siegler and Ramani (2008) is cited to claim that "after four 15-minute sessions of playing a game with a number line, differences in knowledge between students from low-income backgrounds and those from middle-income backgrounds were eliminated".
This passage, especially the use of the word "eliminated" and the absence of context, suggests a dramatic effect at many levels. In fact, the study was specifically of pre-schoolers playing a numerical board game similar to Chutes and Ladders, and focused on their numerical knowledge. A number of subsequent studies by the same authors with more rigorous methods showed smaller positive effects of playing the game that did not eliminate the differences. Such games are among the worthwhile approaches to improving number sense, but it is misleading to present them as a magic bullet as the CMF does.
(iii). In multiple cases the CMF argues against acceleration by citing papers that either do not justify the statements or come to the opposite conclusion. The CMF claims Liang et al (2012) and Domina et al (2015) demonstrated that "widespread acceleration led to significant declines in overall mathematics achievement." As discussed in Section 4 in the document, Liang et al actually shows that accelerated students did slightly better than non-accelerated ones in standardized tests. In Domina et al, the effect is 7% of a standard deviation (not "7%" in an absolute sense, merely 0.07 times a standard deviation, a very tiny effect). Such minor effects are often the result of other confounders, and are far below anything that could be considered "significant" in experimental work.
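To convey how small a 0.07-standard-deviation gap is, here is a minimal sketch (my own illustration, not drawn from the CMF or the cited papers) computing, under the simplifying assumption of normally distributed scores with equal variance, the chance that a randomly chosen student from the higher-scoring group outscores one from the lower-scoring group:

```python
import math

def phi(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Effect size of the kind reported in Domina et al: 0.07 standard deviations.
d = 0.07

# For two normal score distributions with equal variance whose means differ
# by d standard deviations, the probability that a random student from the
# higher-mean group outscores a random student from the other group is
# Phi(d / sqrt(2)).
p = phi(d / math.sqrt(2.0))
print(f"{p:.3f}")  # prints 0.520, barely better than a coin flip
```

An effect of this size means the two groups' score distributions are nearly indistinguishable, which is the point being made above.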
(iv). In yet another case, the CMF cites Burris et al (2006) for demonstrating "positive outcomes for achievement and longer-term academic success from keeping students in heterogeneous groups focused on higher-level content through middle school". But the CMF never tells the reader that this paper studied the effect of teaching Algebra I to all 8th grade students (getting good outcomes): precisely the uniform acceleration policy that the CMF argues against in the prior point.
(v). In some places, the CMF has no research-based evidence, as when it gives the advice "Do not include homework .... as any part of grading. Homework is one of the most inequitable practices of education." The research on homework is complex and mixed, and does not support such blanket statements.
(vi). The CMF claims that Sadler and Sonnert (2018) provides evidence in favor of delaying calculus to college, but the paper finds that taking calculus in high school improves performance in college.
(vii). The CMF makes the dramatic claim that Black et al (2002) showed that students "are incredibly accurate at assessing their own understanding, and they do not over or under estimate it." If this claim were true, no exams would be needed to assess student knowledge: we could just ask them. Unfortunately, the paper of Black et al contains nothing of the sort.
The abundance of false or misleading citations I found in the CMF calls into doubt the credibility of all citations to the literature in the CMF. It is the responsibility of the California Department of Education to fix all defective citations. If there is neither time nor expertise to confirm the accuracy of a citation then it has to be removed, along with everything that depends on it.
My grade for the CMF's accurate representation of the cited literature is F. In the spirit of the CMF's advice that teachers should "always allow students to resubmit any work or test for a higher grade", the state of California should get another opportunity to revise and resubmit to the public for a higher grade.
Public Comment #3: Data Science
This document consists of my comments on Chapters 1 and 5 of the second field review of the California Math Framework (CMF). Chapter 1 conveys the purpose of the CMF: who the stakeholders are and how they can benefit from the document. Chapter 5 addresses the overall topic of incorporating data science into the TK-12 math curriculum.
I wish to emphasize a fact of reality: data science undergraduate degrees require calculus, and data science jobs at Spotify, Salesforce, Amazon, Dropbox, Google, LinkedIn, etc. require degrees in fields such as math, economics, engineering, computer science, etc., and these require calculus and linear algebra. There are "data analytics" jobs requiring little math, but when all the work is running tools on spreadsheets with no guiding mathematical understanding then the job is on track to being automated. For more on that, see this statement.
A detailed discussion of my comments on Chapters 1 and 5 is in this linked document. Below I provide a version of the "executive summary" at the start of that document.
[...]
Executive Summary
Introduction. Some concerns with the 2021 CMF draft have been largely addressed in the second draft, but others remain. The CMF now exhibits the following general issues:
misrepresentations of facts and evidence, and of the field of data science;
lengthy passages having little to do with math content, graphics giving no information (an example is given above this Executive Summary, at the start of this post), and tables of vital information that are difficult to parse (e.g., a blizzard of codewords such as "7.SP.5" that are not defined in the CMF);
guidance either lacking details essential for implementation or seeming to be opinions rather than evidence-based (the interaction of data science courses with other high school math exhibits these concerns);
bias towards statistical topics (which the CMF insists on calling "data science") while minimizing the value of other math standards (such as algebra) upon which data science depends when pursued for degrees and careers.
Just to be clear, there is great merit in treating statistical and data science topics in the high school curriculum (and not only in math classes). Data literacy is an important component of daily life, and a data science course may reignite a student's interest in math. How these topics interact with coursework in a variety of fields is an important topic for public education policy. But the CMF is not treating these matters with the degree of care and detailed attention to downstream effects that is required.
Some guidelines have been prepared on data and statistical literacy, but that is very different from the formulation of math curricula. Math is a pervasive language and way of thinking that impacts preparedness in many fields, particularly those leading to lucrative and stable careers. The fact that data-oriented topics merit consideration in math curricula does not imply that all proposals on the matter are well-designed.
The universality of math makes its role in K-12 education like an iceberg: there are many parts of the educational infrastructure that are not readily apparent at the surface. So efforts to change the high school math curriculum need to involve stakeholders at all levels -- teachers, industry, and content experts in higher education across quantitative fields -- to ensure the needs of all who rely on mathematics are met without making existing challenges even worse.
Data Science. Chapter 5 "Data Science TK-12" promotes a bias toward data science relative to other areas of high school math, based on misinformation and hype. This violates the purpose of the CMF, which is a guide organizing the approved content standards from the State Board of Education (SBE). There are no such standards for data science, and none of the CMF writers has expertise in data science.
[As a professional mathematician who has spoken over the years with many Stanford colleagues, I understand in detail how math and data science are related and how math informs some important applications. I have also consulted with top-level experts in data science on the preparation of this document. If there is any mistake or discrepancy, please let me know and I'll gladly fix it.]
The CMF writing team has ignored the explicit warnings by SBE member Patricia Rucker that bias towards data science over other math routes is forbidden.
The following are the primary concerns with Chapter 5:
Statistics standards are hyped by calling them "data science". Tools such as clustering, modeling, data cleaning, and dealing with messy data are all traditional topics in statistics.
Misleading suggestions that data science does not depend on more advanced math standards when pursued for college degrees and careers, and a failure to clearly distinguish data literacy (useful to anyone) from actual data science (which depends on more substantial math).
The CMF's advocacy to skip Algebra II in high school in favor of data science and statistics, without being honest and transparent about the downstream effects (e.g., cutting off preparedness for 4-year UC degrees in STEM, including in data science), alarms experts across California's colleges and universities.
Unfounded (and false) claims that data science is more equitable and overall more approachable and collaborative than other areas of math. This misleads districts and teachers.
Highlights from the review
(i). The CMF claims that in high-achieving countries there is "very little variation across school or student groups". This is false. Many high-achieving countries (e.g., Korea, Japan, China) put students into separate tracks in high school.
(ii). The CMF contains statements full of hype, such as "The numbers are staggering: around 1.7 megabytes of digital data were created and stored every second for every person on Earth in 2020, and the vast majority of data goes unanalyzed".
While this statistic may sound impressive (indeed, it pops up when you Google, as presumably the CMF authors did, for "impressive big data stats"), it is meaningless. The vast majority of bytes of data created by people are through images and videos. Do we know if 1.7 megabytes (which can correspond to a single high-resolution image) is a large or small amount? And do we really want all of that data to be "analyzed"?
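As a back-of-the-envelope check on the single-image comparison (my own arithmetic; the 900 by 630 resolution is just an illustrative choice, not a figure from the CMF):

```python
# An uncompressed 24-bit color image stores 3 bytes per pixel.
width, height = 900, 630          # an ordinary photo resolution (illustrative)
bytes_per_pixel = 3

size_bytes = width * height * bytes_per_pixel
size_mb = size_bytes / 1_000_000  # decimal megabytes

print(f"{size_mb:.2f} MB")  # prints 1.70 MB
```

So a quantity on the order of one snapshot per person per second says nothing by itself about how much of that data merits analysis.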
(iii). The CMF time and again suggests that data science is more equitable than other fields, as when it says "data scientists work together to address uncertainty in data while avoiding bias" or "Traditional mathematics lessons that have taught the subject as a set of procedures to follow have resulted in widespread disengagement as students see no relevance for their lives. This is particularly harmful for students of color and for girls [...] The data science field provides opportunities for equitable practice, with multiple opportunities for students to pursue answers to wonderings and to accept the reality that all students can excel in data science fields."
All fields of math can be taught well or badly. All educators should object to the notion that students of color or girls cannot excel in mathematical fields other than data science. There is nothing inherently more equitable about data science than other areas of math.
(iv). The CMF confuses data literacy with data science. Data literacy is an essential skill for everyone, and it can be taught as part of many courses, including social and natural sciences, and even the humanities. Students should not be given the false impression that a data literacy course will prepare them for a data science career.
Yet the CMF says that in a "high-school data-science class students can learn to clean data sets – removing any data that is incorrect, corrupted, incorrectly formatted, duplicated, or incorrect in some other way [...] High school students can also learn to download and upload data, and develop the more sophisticated 'data moves' that are important to learn if students are tackling real data sets." These are data literacy skills of using spreadsheets and other software. They are devoid of mathematical content.
(v). The UC area-C loophole is invoked multiple times to promote data science as a UC-approved alternative to Algebra II, even though that approval is a violation of UC policies and regulations.
(vi). The CMF claims that "With the rapid expansion of information available [...] far more students pursue statistics classes than calculus, and may be better served by a data science course as a culminating high school mathematical science experience. In addition to the importance of the data science content – to twenty-first century jobs and to a wide range of college majors."
But the CMF is not supposed to prefer one math course over another, and it is misleading students and teachers. The "21st century jobs" such as computer scientist, data scientist, or statistician require a STEM major in college. Such degrees require calculus, and recognize AP calculus but often not AP Statistics for major requirements. (It is briefly acknowledged that data science degrees need calculus in college, but generally Chapter 5 advocates courses deviating from such readiness.)
So a statistics course can be a fine option, but if given the choice then students seeking such careers are often better-served by a calculus course. (AP calculus could certainly be improved; I am only addressing the context for "better-served".)
Recommendation: Eliminate Chapter 5 and replace it with an appendix written from scratch by a group of recognized and disinterested content experts from industry and academia along with high school teachers.
As a precedent, in 2013 there was a desire to provide guidance on financial literacy, but the SBE-approved standards did not address that topic. The 2013 CMF therefore included an appendix on financial literacy, organized around a couple of standards documents from recognized authoritative bodies.
Such an appendix can also provide guidance beyond data science, such as on: introducing basic coding skills, the relation to computational thinking and computer science concepts, and math topics such as logic or discrete math that are preparatory for college-level computer science. It can also provide clarity about the distinction between statistics and data science, to the extent there is a distinction at the pre-college level. When there is no distinction, it should be called "statistics" to avoid unnecessary hype and confusion.
Given that I argue for eliminating the current Chapter 5, my detailed commentary on Chapter 5 in the linked document may appear to be moot. But that is not so: such commentary conveys many points of concern, driving home the necessity of removing this chapter and starting over with a new team of writers for it as an appendix, as described above. My comments on Chapter 5 are arranged linearly by the CMF text that they reference. The themes around which my Chapter 1 concerns are organized also encapsulate most of the concerns in Chapter 5, except that the CMF bias toward data science is on full display in Chapter 5.
Public Comment #4: High School Pathways in the California Math Framework
This document consists of my comments on Chapter 8 and Appendix A of the second field review of the California Math Framework (CMF). Chapter 8 gives an overview of high school pathways, and Appendix A provides the details on the content of those pathways. (It is unclear why the details of the high school courses are put into an appendix at the end of the 900+ page document.)
Within the high school curriculum (not only in math classes), there is great merit in treating topics from statistics, data science, and computational thinking. Data literacy is an important component of daily life, and a data science course may reignite a student's interest in math. How these topics interact with coursework in a variety of fields is an important topic for public education policy. The CMF is not treating these matters with the degree of care and detailed attention to downstream effects that is required.
A detailed discussion of my comments on Chapter 8 and Appendix A is in this linked document. Below I provide a version of the "executive summary" at the start of that document.
[...]
Executive Summary
Introduction. There are a variety of concerns in both Chapter 8 and Appendix A, but by far the two most significant ones concern the so-called MIC pathway and the UC area-C loophole. The latter is explained towards the end of the Background section on this website, so let me now introduce the MIC pathway.
The third pathway. One of the main proposals in the CMF at the high school level is a new pathway through the first three years of high school math (which may start in grade 8 for accelerated students). Currently there are two options: the "traditional pathway" consisting of Algebra I, Geometry, and Algebra II, and the "integrated pathway" that is cumulatively the same content but is a blend of algebra, geometry, and function topics (and some probability & statistics) within each of the three years.
The 2013 Common Core document lays out the content standards for every year of both pathways. Each high school in California offers one of the two pathways. (Across the country, most states are either all-traditional or all-integrated for their high schools.) A widely-recognized difficulty is that it is hard to cover the full range of topics in each year (due to the added statistical topics). The current process never addressed that: the SBE-approved content standards feeding into this CMF remain unchanged from 2013. The CMF makes no proposal for how to address this challenge (it could have tried, and chose not to do so).
Instead, the CMF goes off in an entirely different direction, proposing a third pathway called "Mathematics: Investigating and Connecting", or MIC for short. It is noted in the CMF that each high school will offer one of the three proposed pathways. The idea seems to be that MIC is an "applied" version of existing content, but details of how that would work are not given.
What is the MIC pathway? In the prior CMF draft, the premise of MIC was to make versions of high school math focused around applications in mathematical modeling or data science. The respective options were called MIC-Data and MIC-Modeling, but it was very confusing because there were essentially no real details addressing the many natural questions one would have (such as actual content and depth of coverage of applications). These were course names serving as empty vessels, and the absence of details made formulating concerns akin to grabbing a piece of Jell-O.
In the current CMF draft, MIC is a single pathway (with courses called MIC 1, MIC 2, MIC 3) and the first two years have a very clear definition: they're just Integrated 1 and Integrated 2 with new names. I am not kidding. The CMF doesn't say this explicitly, but if one looks at the actual content standards for each year, they're identical to those of the Integrated pathway (up to some more statistical topics in the 2nd year). Moreover, there is no discussion of details on incorporating actual modeling or investigating, or really anything new.
But seriously, what is MIC? You may be thinking to yourself: if MIC 1 and MIC 2 are just new names for Integrated 1 and Integrated 2 in terms of the content standards, then why not incorporate whatever novelty there is into the Integrated courses, and do the same for the traditional ones? Great question! The CMF offers no answer.
MIC 3 remains very obscure: there are no details of implementation discussed anywhere; in one place it is said to cover all the content standards of Integrated 3, in another place it is said MIC 3 "is not a data science course", and in yet another place it is said MIC 3 is organized around data topics. This is another piece of Jell-O.
The bottom line is that the MIC pathway proposal is not professional in terms of details and clarity. One of the recommendations I make is the elimination of all discussion of the confusing MIC proposal that is devoid of a clear purpose and a responsible level of detail.
My comments on Chapter 8 are organized around the following themes:
The mysterious MIC pathway
The UC area-C loophole
Chaotic guidance and violation of the Math Placement Act
Redundant passages and unhelpful figures
Proposals missing essential details and/or evidence
Lack of clarity on standards and math topics
Forbidden bias among pathways
My comments on Appendix A are listed linearly and are largely centered around the MIC pathway and the UC area-C loophole.
Highlights from the review:
(i). In lines 912-924 of Chapter 8, it is said that ``the MIC 3 course described below is not a data science course'' yet in Appendix A the paragraph on lines 760-769 contains the passage
``MIC 3 has many student investigations driven by data. Students generate questions, design data collection, search for available existing data, analyze data, and represent data and results of analysis. They use powerful technological tools to help with all of these tasks. Much of the content in all Content Connections is situated in stories told through data.''
that looks a lot like a data science course (especially given how loosely ``data science'' is used in Chapter 5). In fact, it is taken almost verbatim from a discussion of the 3rd-year data science course MIC-Data in the previous CMF draft (Chapter 8, lines 989-993 of that draft).
If MIC 3 is indeed meant to serve as a 3rd-year data science course, then the claim on lines 914-915 that MIC 3 meets the Common Core learning outcomes of Integrated Math 3 is false. And due to the absence of details, it is in any case impossible to assess the plausibility of the claim that MIC 3 covers all learning standards for Integrated Math 3. Appendix A shows that MIC 1 and MIC 2 are simply Integrated Math 1 and Integrated Math 2 under new names. Overall, MIC is a deeply problematic proposal.
(ii). Lines 997-1021 of Chapter 8 consist of an itemized list about data science courses and the UC area-C approval loophole. The UC area-C approval process has serious problems:
it is closed to public inspection,
the rubric it uses does not assess the actual math content (it is entirely performative),
the people who apply the rubric aren't required to have expertise in math: in practice they do not, and so they can't catch fake prerequisites (e.g., courses that claim to have an Algebra II prerequisite but don't genuinely use that content substantively as required by UC policies and regulations for an advanced/honors course or for validation; an example of such fakery is confirmed in this October 3, 2020 email in green on pages 5-6).
Parents and school districts in California see UC approval as a gold standard of quality, but for area-C in connection with Algebra II it has degenerated into something that falls far short of expectations. A validation process created to serve students who learn math outside the public schools has been distorted into endorsing learning less of the SBE-approved content standards.
Line 1003 of Chapter 8 notes that UCLA's Introduction to Data Science (IDS) class is UC-approved as an ``advanced/honors'' math class, but that approval is a violation of UC's own policies and regulations on such classes. Indeed, IDS's own material says it is approved to validate Algebra II yet that approval is devoid of meaning since the Common Core standards covered in IDS are entirely in statistics and so make no contact with the non-statistical Algebra II content standards (i.e., nearly all of Algebra II).
The mistaken impressions that IDS provides STEM-readiness are not limited to the defective Algebra II validation for the UC system. They have reached the level of ten US Senators erroneously promoting this course to the NSF as a vehicle for STEM pipeline diversification (even though the course pitches itself as an alternative to Algebra II, damaging readiness for college degrees in STEM, including data science).
(iii). The vignettes in the CMF generally obfuscate rather than clarify. For instance, one on lines 542-686 in Chapter 8 is six pages long, going on and on and on with non-mathematical discussions rather than being succinct and focused. The previous CMF had all examples occupying at most 1 page (often less) because it focused on the math. The CMF is not a place for this style of unfocused writing.
After reading that entire vignette, it is unclear what new math knowledge students are learning. All examples have to focus on the math content, as prior CMFs did and this one sadly often does not.
(iv). There is evidence-free wishful thinking about compression courses. On lines 971-981 of Chapter 8, the CMF discusses a hypothetical non-accelerated student who spends Grade 11 in Data Science, which Chapter 5 shows to be essentially devoid of interaction with the non-statistical topics in Algebra II and Integrated Math 3. It is then claimed that this student can enroll and do fine in precalculus in Grade 12 by means of a ``half-semester support class''. This is a fantasy with no basis in realistic experience.
If it were possible, then all students could go straight from Geometry or Integrated 2 directly into precalculus with a ``half-semester support class'' (whatever that is), and the CMF would be shouting it from the rooftops as a solution to the thorny problem of reaching calculus in grade 12 from 9th-grade Algebra I. This is wishful thinking, as demonstrated by the SFUSD nightmare for students, which consists of numerous contortions: doubled-up classes, private tutoring, and so on.
As another example, on lines 786-791 of Chapter 8 the CMF casually makes a dramatic curriculum proposal (with very inadequate details) to fit the 4 years from Algebra I through precalculus into 3 years by ``eliminating redundancies''. Such a brief proposal cannot be taken seriously in the absence of carefully explained details. The CMF is not an appropriate venue for ``if only'' speculations. It's entirely normal to need some repetition when learning mathematics, and the phrase ``eliminating redundancies'' is an empty vessel in the absence of a detailed proposal.
(v). For the passage
``Such an assessment might be coupled with supplementary or summer courses that provide the kind of support for readiness that Bob Moses’ Algebra project has provided for underrepresented students tackling Algebra in middle and high schools for many years (Moses and Cobb, 2002).''
on lines 848-851 of Chapter 8, the CMF is recognizing Bob Moses' Algebra Project, his national effort to teach Algebra I in 8th grade as a vehicle for enhancing math and science literacy for all students.
But in using the phrase ``tackling Algebra in middle and high schools'', the CMF passes over in silence that his main push was for Algebra I in 8th grade. It would honor Moses' legacy far better by proposing more detailed guidance on how to help more students achieve readiness for Algebra I in 8th grade.
As everyone on the CMF writing team, the CFCC oversight team, and the math IQC team surely knows from reading his cited book Radical Equations (as I did), Moses' approach involved helping kids in the earlier grades. On page 15 of that book, Bob Moses formulated a primary goal:
``all middle school students ready to do the college prep math sequence when they get to high school.''
He acknowledged that the definition of ``college prep math sequence'' varies from place to place yet viewed this standard as
``the floor, not the ceiling. We're not trying to put constraints or limits on what any group of students might learn.''
If the CMF put the same level of effort into laying out Moses' approach as it does into advocating versions of data science that serve as an off-ramp from preparation for 4-year quantitative college degrees, then it would align with the mission to which Bob Moses devoted the last 35 years of his life.
(vi). In multiple places in Appendix A, the CMF refers to an SBE-approved May 2021 document on digital learning guidance that speaks about a version of the CMF that no longer exists. For example, that document refers to a MIC 3 that has nothing to do with the MIC 3 in the new CMF.
How could the SBE approve a document in May 2021 whose Section B refers so much to a CMF that at the time was not approved (and has since undergone huge changes)? That Section B has to be revoked.
Recommendations: (1) Eliminate all discussion of the MIC pathway, (2) eliminate all mention of UC area-C approval, (3) put the details of high school pathways into the main text (not buried out of sight in an appendix at the end, as is presently done), (4) revoke Section B of the May 2021 Digital Learning Standards Guidance.
Public Comment #5: Equity, Technology, Assessment, and Problematic Placement
This document consists of my comments on Chapters 10, 11, 12, and 13 of the second field review of the California Math Framework (CMF).
Chapter 10 discusses approaches to equity in K-12 math classes, Chapter 11 treats the role of technology in math classes (including distance learning), Chapter 12 is about assessment of mathematical learning (homework, exams, etc.), and Chapter 13 concerns the criteria that creators of course material (``publishers'') must meet to be approved for statewide adoption (for grades K-8).
My detailed comments are provided in this linked document. Below I provide a version of the "executive summary" at the start of that document.
[...]
Executive Summary
Introduction. In the 2013 CMF, the analogues of the current Chapters 11 and 12 were a fairly modest 10 and 13 pages respectively, while the analogues of Chapters 10 and 13 were respectively 36 and 25 pages long. Now Chapter 13 on publisher criteria has become shorter (only 22 pages), whereas Chapter 12 on assessment has exploded in length (a whopping 70 pages).
In the 2005 CMF, the chapter on assessment was a mere 5 pages long. What could make the discussion of assessment grow so dramatically (from 5 to 70 pages) over two CMF cycles? The primary reason is the inclusion in the current CMF of extensive opinionated discussion opposing homework and a variety of standard assessment practices. These opinionated discussions have to be removed.
The writing of Chapter 12 suggests its guidance is evidence-based, but that is false. My extensive comments elsewhere on the overall citation misrepresentation in the CMF have many entries for Chapter 12 corresponding to the opinionated discussions noted above; I won't repeat those here. The greatest concerns among Chapters 10-13 are in Chapter 12, so let me first briefly address Chapters 10, 11, and 13:
In Chapter 10, there are proposals that disrespect the time burden on teachers and assertions that indicate a misunderstanding of the universal nature of mathematical content. It also promotes violation of the Math Placement Act in multiple places.
In Chapter 11, the discussion of distance learning has almost nothing to do with math, and so it should all be removed. There is a separate SBE-approved May 2021 document specifically about distance learning, but its Section B on math has to be revoked due to extensive reliance on an earlier (not yet approved) CMF draft that is very far-removed from what now exists. The replacement for that Section B is where all discussion of distance learning for math should be placed. Once that is done, Chapter 11 will be half as long and its remaining concerns easy to fix.
In Chapter 13, the concerns center around several things that occur elsewhere in the CMF: data science hype as in Chapter 5, denigration of the importance of content standards as in Chapter 1 (despite legal obligations to the contrary that are also noted in Chapter 13), and denigration of the importance of procedural skills as in Chapter 12.
Assessment concerns. Finally, we turn to Chapter 12 on assessment. The main concerns fall into three types:
(i) Irresponsible and baseless denigration of the importance of procedural fluency alongside conceptual understanding.
(ii) Advocating for the elimination of math homework and the wholesale use of subjective measures of learning (such as portfolios) in place of objective measures, thereby reducing accountability to external authorities.
(iii) Promoting violations of state law concerning course placement.
A sample illustration of (i) appears on lines 53-57 of Chapter 12, with the passage
``It has long been the practice in mathematics classrooms to assess students’ mathematics achievement through narrow tests of procedural knowledge. The knowledge needed for success on such tests is far from the adaptable, critical and analytical thinking needed by students in the modern world.''
This promotes a cartoon view of how reliable mathematical skills for applications are acquired. It is on par with opposing the teaching of spelling on the grounds that spell-checkers exist and spelling is not part of analytical thinking.
The writing of the CMF did not involve collaboration with recognized STEM experts in industry, and its content advocacy at the high school level has caused serious concern among STEM experts in higher education who provide foundational training for jobs of the future. The CMF has to be more balanced in its discussion of the equally important fluency in procedural skills (to a non-tedious level) and conceptual understanding.
Turning to (ii), the advocacy against math homework is discussed at length in the comments on Chapter 12 in my separate compilation of citation misrepresentations. Let us just highlight here what is said on lines 753-755:
``Do not include homework, if given, as any part of grading. Homework is one of the most inequitable practices in education; its inclusion in grading adds stress to students and increases the chances of inequitable outcomes.''
As public policy guidance, this is extraordinarily irresponsible. Who would ever hire an accountant that never did math homework or drive a car designed by an engineer who never did math homework?
The other component of (ii) entails formative assessment, a means of evaluating student progress that has some merits for flagging misunderstanding early on. I am not objecting to some use of formative assessment, but experts acknowledge that implementing this approach places significant new burdens on teachers. The CMF is silent on that very practical issue.
Let us now discuss (iii). Chapter 12 omits all discussion of assessment for math course placement, and this omission is a major concern because in the time since the last CMF adoption in 2013 there has emerged a pattern of local school boards in California flagrantly violating existing state laws concerning such placement. The most visible example of this is SFUSD, and local Boards of Education in other states have also been engaging in such policies, following the lead of SFUSD. Hence, stopping the illegal policies in California should have effects more widely.
Passages in multiple chapters (such as 9, 10, 13) promote policies that violate the Math Placement Act. It is incumbent upon the CMF to clearly and visibly insist upon the obligations of local Boards of Education under two state laws discussed below. Since part of the illegal activity has relied on local boards creating their own placement exams that are intentionally excessively difficult (as illustrated below), the CMF has to recommend the use of third-party objective assessments when meeting the legal obligations outlined below.
The actions of SFUSD. Beginning in Fall 2014, SFUSD forced enrollment in Algebra I for the large number of 9th grade students who had already completed a UC-approved Algebra I course. This violates item (a) in Section 51228.2 of the California Education Code, which says districts
``shall not assign a pupil enrolled in any of grades 9 to 12, inclusive, in a school in the school district to a course that the pupil has previously completed and received a grade determined by the school district to be sufficient to satisfy the requirements and prerequisites for admission to the California public institutions of postsecondary education and the minimum requirements for receiving a diploma of graduation from high school established in this article,''
(the law allows exceptions, but those are not applicable when the parents do not approve).
More specifically, beginning in Fall 2016, SFUSD has required such students to pass an additional hurdle in order to avoid retaking a course on what they have already learned: its own Math Validation Test (MVT). This test is specifically designed to be much harder than the Algebra I course, ensuring that most who attempt it will fail. This policy of an additional hurdle would be illegal even if SFUSD's MVT were not excessively difficult, though the excessive difficulty violates yet another state law as discussed below.
[Curiously, those illegal policies, based on holding back students with advanced knowledge, are the opposite of the UC area-C loophole, which amounts to promoting the learning of less math to fulfill UC admission requirements. In both cases, the people violating policies and regulations wrap themselves in the banner of equity, as if claims to be pro-equity cannot be false.]
A relevant reference on this matter is the paper (LaMar et al., 2020) cited in Chapters 2 and 12 of the CMF. The title of that paper contains the phrase ``Derailing Impact of Content standards'' which directly opposes the fundamental principle of educational standards. The text of the paper confirms the illegal behavior by SFUSD: it says that opposition to the SFUSD policy came from ``parents of currently accelerated students''; this admits the forced holding back of students who were already accelerated, a violation of Section 51228.2 as discussed above.
A report by Families for San Francisco revealed the claimed benefits from the illegal SFUSD policy to be a mirage of lies. (The SFUSD policy would have been illegal even if the claimed benefits were not fraudulent.)
Elsewhere in the California Education Code is Section 51224.7, the Math Placement Act that explicitly requires
``a fair, objective, and transparent mathematics placement policy for pupils entering grade 9''
based on ``objective academic measures''. This forbids the use of formative assessment (which is subjective by definition, and hence prone to abuse by officials who oppose legally-mandated course placement), and the law mentions ``statewide mathematics assessments'' as an example of such a measure. Chapter 9, on lines 541-557, reminds districts of their obligations under this law but neglects to mention the crucial ``fair, objective, and transparent'' requirement noted above.
Section 1 of California Senate Bill SB-359 explicitly calls out these concerns in item (c)
``The most egregious examples of mathematics misplacement occur with successful pupils and, disproportionately, with successful pupils of color. These successful pupils are achieving a grade of “B” or better, or are testing at proficient or even advanced proficiency on state assessments.''
and two further items
``(i) California faces a looming shortage of college-educated workers in an increasingly competitive global economy.
(j) A policy for correct mathematics placement must be addressed in order to ensure a fair process and chance of success for all pupils.''
illustrate the significance of adherence to this law.
Those who claim to be champions of equity should put more effort and resources into helping all students to achieve real success in learning mathematics, rather than using illegal artificial barriers, misrepresented data and citations, or fake validations to create false optics of success.
Recommendation. Chapter 12 has to fully address math course placement, and clearly state that local Boards of Education have a legal responsibility to follow the two state laws discussed above. Assessment options suitable for discussion in Chapter 12 include:
the MDTP made freely available by UC/CSU,
the NWEA MAP assessment,
the iReady assessment,
the Smarter Balanced assessment.
A broader discussion of CDE-recommended diagnostic assessment is given here. The CMF also has to clearly state that in contrast, the MARS assessment (which is mentioned elsewhere in Chapter 12) is forbidden to be used for course placement purposes because it is based on formative assessment. That is subjective rather than objective, and hence violates the legal requirement for objective measures in the Math Placement Act.
Public Comment #6: Avoiding Acceleration via Wishful Thinking
This document consists of my comments on Chapter 9 of the second field review of the California Math Framework (CMF).
The title of Chapter 9 is ``Structuring school experiences for equity and engagement''. It may be unclear what the title means, but ultimately this chapter aims to offer ideas for accessing advanced math options near the end of high school -- really calculus -- without acceleration. A variety of ideas are suggested, but they're all flawed (as will be explained). In the end, the CMF has no serious, detailed plan to address this goal, despite having had more than 2 years to produce one.
My detailed comments are provided in this linked document. Below I provide a version of the "executive summary" at the start of that document.
[...]
Executive Summary
The reason for the CMF's failed contortions in trying to reach calculus in high school without acceleration is that it is trying to square the circle: if the goal is for an ordinary student to reach calculus in 12th grade at a regular academic pace (no compressed courses, doubled-up courses, etc.), then they have to take Algebra I in 8th grade. That is reality, as has been emphasized by Bob Moses in connection with his Algebra Project and was also later noted by Adrian Mims in connection with his Calculus Project: ``If a student isn’t taking algebra in eighth grade, there’s no way for them to take calculus senior year.''
The CMF praises the efforts by Moses and Mims (see lines 466-473), but does not engage with the effort to help more kids be ready for Algebra I in 8th grade (its overstated false description of Common Core grade 8 on lines 517-523 doesn't count).
Highlights. To give a sense of the proposals to reach advanced math by the end of high school -- really calculus -- without acceleration, here are the last 3 suggestions in order (see lines 588-593):
the state of California convenes a panel of experts to figure it out;
schools use the block-schedule method with doubled-up classes and supplementary courses that has been a many-year logistical nightmare in SFUSD to manage its banishment of acceleration;
take data science.
There is a remarkable passage tucked away on lines 538-557: the CMF reminds districts in detail about their legal obligations under the Math Placement Act of 2015 (though it neglects to remind them of the ``fair, objective, and transparent'' requirement under that law). In particular, on lines 554-555 the CMF notes that districts are required to offer
“clear and timely recourse for each pupil and his or her parent or legal guardian who questions the student’s placement.”
Both SFUSD and Palo Alto USD have been violating requirements of this law for many years, so it will be interesting to see if the CMF has an impact on their behavior.
Public Comment #7: Number Sense and K-8 Guidance
This document consists of my comments on Chapters 3, 6, and 7 in the second field review of the California Math Framework (CMF). I am making them available to the public here.
Chapter 3 is about number sense (mental arithmetic with small numbers, visualization of numerical relationships, building experience toward algebra, etc.), and Chapters 6 and 7 are about the content of math classes prior to high school: transitional kindergarten (TK) through grade 5 is the focus of Chapter 6, and grades 6-8 are the focus of Chapter 7.
My detailed comments are provided in this linked document. Below I provide a version of the "executive summary" at the start of that document.
[...]
Executive Summary
Introduction. The main concerns here are not about the math content (in contrast with high school, for which data science hype and related matters are a major concern). Instead, they are about the low quality of many of the examples in Chapters 6 and 7, and the large amount of space taken up in them by matters that have nothing to do with mathematics. (Chapter 3 could be improved in some ways, but that is garden-variety polishing.)
Much of what is in Chapters 6 and 7 is an embarrassment to professionalism. There are some bright spots, where it is clear that someone who understands math and its applications did the writing. But long passages meander around in ways that will be of little or no use to districts and publishers turning to the CMF for guidance. They'd be better off going to the California Common Core manual from 2013 to figure out what to do for K-8.
It doesn't make sense that after more than two years of development, what has emerged in Chapters 6 and 7 is at such a low level of quality (in exposition and utility). When reading these chapters, it often felt like reading a term paper by someone who scrambled to put it together on the evening before it was due.
Highlights
(i) In contrast with the 2013 CMF, the current CMF draft does not list the actual content standards from the 2013 California Common Core document in human language. Instead, the reader of the CMF is given long tables full of mysterious acronym codes whose definitions can only be determined by flipping through the 2013 California Common Core document (the tables themselves give only a very abbreviated list of highlight topics in English). This alphabet soup of codes makes an already difficult-to-use document even harder to read, undermining the utility of the content standards.
But to make matters worse, most of these acronyms are wrong in Chapter 6, due to stray letters added all over the place (e.g., 2.MD.B.6 rather than 2.MD.6). I know where the stray letters come from, but that is beside the point: an ordinary teacher or district official is not experienced in curriculum-writing and so won't know what to ignore in these codes when trying to decipher them. This permeates most tables in Chapter 6, as well as nearly every mention of such acronyms in the main text of Chapter 6. It occurs all over the place, on the following lines:
570-579, 631-638, 1345, 1347, 1555-1558, 1633, 1635, 1647, 1650, 1661, 1664, 1700, 1717, 1751-1753, 1770, 1784-1791, 1807, 1809, 1847, 1884, 1941, 1996, 1997, 2008, 2009, 2089, 2096, 2100, 2116, 2296, 2349, 2353, 2401, 2403, 2428, 2450, 2501, 2573-2580, 2611-2616, 2652, 2659, 2667, 2672, 2677, 2683, 2723, 2736, 2856, 2868, 2870, 2894.
When things are not written out in human language, the proliferation of so many discrepancies makes an unpleasant expository choice even more frustrating for users.
(ii) In multiple passages, the CMF denigrates math that isn't immediately applicable to one's local community or isn't data science. What about math that can be applied to future professions beyond one's immediate vicinity, or to outer space or other things that can excite kids' imaginations?
Writing denigrating such math can be found in lines 1406-1408, lines 3002-3007 of Chapter 6 and lines 511-521 in Chapter 7. Whatever author is responsible for such a myopic view of mathematics should never again be involved in the setting of public policy guidance on math education.
(iii) Many examples meander for a long time with little or no math content to justify the length, and others are so brief that the math content is never discussed in detail. In both circumstances, this is of no use to districts or publishers. The previous CMF always had succinct focused examples which were entirely focused on math content.
Consider the example on lines 1400-1513 of Chapter 6 under the title Habitat and Human Activity. This is not only devoid of mathematical content as presented, but in a large part of the middle it turns into an English literature activity (structuring cohesive texts, comprehension and analysis of written and spoken texts, etc.): see lines 1430-1461 of Chapter 6.
Later in this example, on line 1494 we are told that students ``designed a solution'' yet no problem has ever been stated that is being solved. Then on line 1498 it is said that students ``conveyed their ideas about the problems'' but we have never been told what the problems are. How does such chaotic writing occur, and furthermore persist at this stage with a team of 20 people doing oversight?
In Chapter 7 there is a vignette more than 10 pages long (lines 1110-1397) about different ways to rewrite 2L+2W (e.g., 2(L+W), L+W+L+W, etc.), yet the much richer example of clever ways to visualize the more advanced algebraic identity n² - (n-2)² = 4(n-1) is discussed in a single paragraph (lines 1093-1105).
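For readers checking the algebra, that identity follows from a two-line expansion via the difference of squares (one way to visualize it -- not necessarily the CMF's -- is that removing the (n-2)-by-(n-2) interior of an n-by-n grid of unit squares leaves a border ring of 4(n-1) cells):

```latex
\[
n^2 - (n-2)^2 \;=\; \bigl(n-(n-2)\bigr)\bigl(n+(n-2)\bigr) \;=\; 2\,(2n-2) \;=\; 4(n-1).
\]
```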
Near the end of Chapter 7 is a grade-6 vignette (lines 1614-1691) in which kids cut solid clay into shapes. That could turn into an interesting study in solid geometry, but in the CMF details there is nothing with mathematical content (beyond noticing some unexpected face shapes). Instead, the end result is collecting data and uploading images into a computer. Where is the grade-6 math? Of what utility is this to someone teaching a math class when the math content is hardly discussed?
Recommendation: The CMF has to include full statements in English (not strings of acronyms) of the learning standards for all grades (and high school courses), as the 2013 CMF did. This is the only way to provide complete clarity about what concepts have to be taught. It holds districts accountable if they try to monkey around with the curriculum or play games with legal obligations related to course placement, and gives parents clarity about what their children should learn in a given year.
Content management and control for math courses will become extremely important as more people ``innovate'' with the curriculum (the data science experience is a warm-up of what is to come), leading to a proliferation of courses. There must be clarity for all stakeholders about what the grade 5 curriculum is, what ``Algebra II'' and ``Integrated 3'' exactly mean (e.g., that these courses cover certain aspects of logarithms and square roots), and so on. The recent phenomenon of the UC area-C loophole involving fraudulent validation of Algebra II cannot be permitted to re-appear in yet another form due to ambiguity about the curriculum.
The CMF needs to have clarity as a self-contained document, since this is what districts and publishers will look to when setting policy. Public education requires district accountability to external authority, and clarity in the CMF is essential for such accountability.
Public Comment #8: CSU and UC faculty committee comments on HS pathways
In mid-May, committees of faculty experts representing campuses of the California State University (CSU) and the University of California (UC) independently submitted letters as "public comment" to the revision process for the CMF. Both letters focus on high school math pathways. I briefly describe them here and provide links to each. The concerns they raise fit well with ones I have mentioned in my public comments, and also point out aspects I hadn't noticed.
On May 13, a "public comment" letter was submitted from a standing committee of the CSU Academic Senate consisting of disciplinary experts in mathematics and education. The main concern addressed therein is that the CMF treatment of math pathways is likely to damage diversity efforts in STEM education.
The end of the letter urges that alternative high school math pathways in the CMF should "serve to inspire, motivate, and prepare students for authentic access to the full spectrum of college level quantitative work", and says that in the CMF "strong caution should be provided that advanced courses such as data science, statistics, or financial algebra sufficiently deepen the understandings of precalculus and algebraic concepts as to not result in prematurely channeling high school students away from STEM or other algebra-intensive majors." The chair of that committee published an opinion piece that goes into more detail on this, and includes a link to that letter as well.
A "public comment" letter was submitted on May 16 from the UC area-C faculty workgroup. This workgroup consists of one faculty representative from each UC campus. It is a mixture of faculty (including some lecturers) in mathematics and statistics, and was formed in January 2022 to advise on the mathematics ("area C") requirements for admission to UC campuses in light of the proposed CMF.
The letter focuses on the CMF's proposed third high school math pathway, called MIC (Mathematics: Investigating and Connecting), a topic I discuss at length in my post #4. It independently arrives at many of the same concerns, and points out others that I had not noticed. It also ends with the same recommendation: the MIC high school pathway should be removed from the CMF. Permission has been given for that letter to be disseminated to the wider public now, so you can read it for yourself here.
Public Comment #9: Citation Gimmickry and the UC area-C loophole
The Mathematics Framework Third Field Review (hereafter called the CMF) is the outcome of an approximately year-long revision by State Board of Education staff and the Region 15 Comprehensive Center at WestEd. This document contains my detailed comments on some concerns with this revised CMF. Below I provide a version of the "executive summary" at the start of that document.
Given that the CMF is going to be influencing math education in this country for the next decade, it is unacceptable that the State Board of Education is providing such an extremely short time period (including a weekend followed by a federal holiday) to review the Framework. Critical concerns remain, and the CMF does not live up to the standards of a document that sets state-wide education policy.
Citation misrepresentation persists. Despite objections from more than 440 STEM faculty from across the state, guidelines have still not been developed for data science to be in alignment with math education content standards. The document also has critical inconsistencies that open up the possibility for public schools to implement the CMF in contradictory ways. Finally, the CMF still invokes a UC policy on data science courses substituting for Algebra II that has been challenged at multiple levels, including its recent outright rejection by the entire California State University system. All of the above are critical shortcomings, due to which the CMF cannot be approved in its current form.
The previous CMF draft had an enormous amount of citation misrepresentation (i.e., false or misleading descriptions of citations from other literature). The revision process removed many of the misrepresentations I had flagged, but some misrepresentations have not been corrected, and some corrections have been so cosmetic as to not fix the problem at all. I also found more cases of citation misrepresentation which I had not previously noticed (working by myself). It also cites unpublished papers with design flaws. How did the revision process fail to fix all of this? Why does the CMF still not adhere to the level of research quality from the What Works Clearinghouse?
Furthermore, the CMF still embraces the UC area-C loophole for allowing Algebra II to be replaced with courses (such as in data science) that significantly fail to cover the required content standards. My public comments on Background and Chapter 8 for the prior CMF gave extensive information warning about its deeply problematic nature, and the open letter signed by more than 440 California higher education faculty has warned about this problem too.
Since I raised the alarm, the concern has been further magnified by the California State University (CSU) system. The abdication of quality control through this loophole is so outrageous that in January 2023 a resolution to the CSU Academic Senate (see page 2 of this document) red-flagged that due to UC's policy,
``school districts have begun submitting coursework alternatives to [Algebra II and Integrated Math 3] that do not address the range of standards expected for college and career readiness ... This in turn threatens to increase the number of students entering the CSU who are identified as needing extra support to succeed ... CSU must do whatever it can to ensure that A--G college preparatory coursework properly prepares students to attend both the CSU and UC.''
The same resolution also formally discontinues that UC policy at CSU. The State Board of Education has a responsibility to research questionable practices proposed for inclusion in state policy. What research did the SBE complete to overcome the objections that have been raised since long ago by college and university content experts from all over California, and more recently by the entire CSU system?
The public expects those hired at taxpayer expense to revise the CMF to get it right. This is education policy that will impact millions of children in California, and likely the entire country. It must be held to the highest standards of quality. If there is an oversight or mistake in what I have written then please tell me and I'll gladly fix it.
Public Comment #10: Algebra Guidance Inconsistencies and Data Hype
When the California State Board of Education (SBE) unanimously approved version #3 of the California Math Framework (CMF), it was noted as part of the vote that
``if we do have places in the Framework that are not aligned with each other, we can ask the CDE [California Department of Education] and the State Board staff as part of the technical edits to be sure that clarity and alignment is there.''
In other words, the SBE made a commitment to making the necessary edits to remove internal inconsistencies within the CMF. Consistency with the state-approved standards also requires that the CMF supports the key point made by Ji Son and Jim Stigler in their article: data science should be a great source of motivating examples and applications for the math curriculum rather than be set up in opposition to it.
This document lists the places where the SBE-approved version of the CMF has such inconsistencies (which therefore have to be fixed) on two issues: (i) the value of increasing readiness for 8th-grade Algebra I or Mathematics I, and (ii) hype about data science and its interaction with Algebra II/Mathematics III. Below I provide the ``executive summary'' at the start of that document.
Algebra I or Mathematics I in eighth grade. The CMF needs to state clearly in its discussion of Algebra I and Mathematics I that it endorses the notion that all students who are ready for rigorous high school mathematics in eighth grade should take such courses (Algebra I or Mathematics I), and that all middle schools should offer this opportunity to their students.
Indeed, the CMF is ``a guide to implementing the California Common Core State Standards for Mathematics (CA CCSSM or the Standards), adopted in 2010 and updated in 2013'' (as stated on lines 37-39 of Chapter 1 of the CMF). Page 80 of Appendix A to the national Common Core Standards for Mathematics (on high school) includes exactly the above endorsement about Algebra I in middle school, and pages 831-2 of Appendix D to the CA CCSSM reference that Appendix A as part of their guidance on middle school course placement options.
Aligning more clearly with the value of increasing readiness for 8th-grade Algebra I would also align the CMF's equity goals with programs having a demonstrated substantial track record of excellence in demographic outcomes for advanced high school math (which in turn is the key to readiness for earning 4-year college degrees in the fields offering the most reliable path to future job security and social mobility: STEM fields, economics, business, data science, statistics, etc.).
Examples of such programs include Ab7G at Purdue University, Adrian Mims' Calculus Project, Bob Moses' Algebra Project (whose origins were in providing readiness for 8th grade Algebra I to students in under-resourced Boston public schools), BEAM, and Ile Omode's middle school math program (partnership with a local community college). A common theme of these programs is a focused effort on helping more kids from under-resourced schools to be ready for 8th-grade Algebra I. (Providing additional on-ramps, such as via parental choice, multiple support resources, or an opt-out program as in Dallas, etc. is valuable too.)
In contrast, the most prominent program which took the opposite approach of blocking everyone from 8th grade Algebra I, namely SFUSD, was a complete failure in achieving its equity goals (as confirmed by a report from Stanford's Graduate School of Education). Even SFUSD's own superintendent has acknowledged (see this video, at 2:20:25) that after 10 years their approach to math ``is not working'', and the district is reversing its policy.
Data Science and later high school math. The president of the NCTM has aptly said,
``Data science doesn't have to be this brand new thing. Take what you would normally do in Algebra 2 and normally do in Integrated III, and infuse it with data.''
It is essential that the CMF maintains very clear messaging that is consistent with the California Common Core State Standards for Mathematics: a high school data science course can be perfectly fine when taken as an elective, but it is not an alternative to the Algebra II curriculum for anyone who wants the preparation for a 4-year quantitative college degree: STEM, economics, data science, business, statistics, etc.
Those quantitative college degrees and careers offer the most job security and social mobility for the future, and hence are essential for providing true equity in future higher education (which is what matters to current high school students who will go to college). The overwhelming consensus from experts in those degree areas concerning mathematical prerequisites matters a lot to parents, and the CMF has to be attentive to that.
The CMF's remaining advocacy for data science courses as an alternative to the Algebra II curriculum (rather than as an elective, much like high school economics) leaves students substantially underprepared for any 4-year STEM major in college, including data science, computer science, statistics, and engineering, as well as 4-year degrees in economics and business. This has been affirmed in unambiguous terms by: a May 2022 letter to BOARS and the UC Office of the President from a significant majority of black UC faculty in fields related to data science, a recent resolution from the CSU Academic Senate to discontinue such validation for CSU (on the grounds that it is ``inadequate for college and career readiness''), the unanimous July 7 BOARS vote opposing the validation of Algebra II by data science (confirmed by the official BOARS July 17 meeting minutes), and an open letter signed by nearly 450 California college and university experts across all quantitative fields (including economics, business, and political science).
The CMF also still promotes the false message that data science is better-suited to good teaching. The need to improve K-12 math pedagogy has no bearing on the relevance of content standards. The claim that the curriculum, which remains unchanged from 2013, ``offers little practical utility in the 21st century'' is seriously misinformed: it is contradicted by the quantitative faculty experts across the CSU and UC systems noted above, and its falsity is well-known to anyone familiar with how modern technology and real data science actually work.
Moreover, items from the curriculum cherry-picked in isolation to create an impression of obsolescence can only indicate where pedagogy needs more contemporary motivation and delivery, nothing else. For instance, polynomial division is offered as a strawman example, yet precisely such algebra underlies QR codes and the secure sharing of secrets over computer networks. (The extent to which material is explored manually or with technology is not a basis for determining whether the underlying mathematics has contemporary relevance.)
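To make concrete why such polynomial algebra is far from obsolete, here is a minimal sketch (my own illustration, not drawn from the CMF or my public comments) of Shamir's secret sharing: a secret is hidden as the constant term of a random polynomial over a prime field, and any sufficiently large subset of "shares" (point evaluations) recovers it by Lagrange interpolation. Polynomial arithmetic of exactly this kind also powers the Reed-Solomon error correction inside QR codes.

```python
# A minimal sketch of Shamir's secret sharing (illustrative only).
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is done modulo PRIME

def make_shares(secret, threshold, n):
    """Split `secret` into n shares, any `threshold` of which suffice."""
    # Hide the secret as the constant term of a random polynomial
    # of degree threshold - 1.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

    def evaluate(x):
        # Horner's rule for polynomial evaluation mod PRIME.
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc

    return [(x, evaluate(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, threshold=3, n=5)
assert recover(shares[:3]) == 42  # any 3 of the 5 shares reconstruct the secret
```

Fewer than `threshold` shares reveal nothing about the secret, which is why the scheme is used in real distributed-key systems; the entire mechanism is nothing more than high school polynomial algebra carried out modulo a prime.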
Public Comment #11: Citation Misrepresentation and other Ongoing Concerns (October '23 revision)
At the time of the July 12, 2023 unanimous vote by the California State Board of Education (SBE) to approve version #3 of the California Mathematics Framework (CMF), the following statement was made:
``If we do have places in the Framework that are not aligned with each other, we can ask the CDE (California Dept. of Education) and the State Board staff as part of the technical edits to be sure that clarity and alignment is there.''
The SBE thereby committed to make edits to remove internal inconsistencies in the CMF.
A revised CMF with the ``technical edits'' was posted on the CDE website for the CMF around 3 months later, on October 19, with new chapter and appendix files replacing the Board-approved mid-July version.
This was posted without public announcement: the box entitled ``Recently Posted in Mathematics'' on that webpage still (as of today, November 1) says ``No items posted in the last 60 days''. The top of the first page of each file is dated July 12, 2023 (replacing ``July 12-13, 2023'' on the Board-approved version).
The existence of this new version was announced to a specific CDE mailing list (to which I don't belong) on the afternoon of October 19, with the following message:
``At its meeting on July 12, 2023, the State Board of Education adopted the new Mathematics Framework. A revised version that includes the edits requested by the Board as part of its action is posted at https://www.cde.ca.gov/ci/ma/cf/.
A professionally edited and designed PDF version of the framework is in development and will likely be available in 2024. The content of the PDF will not substantially change from the version that is currently posted. There are currently no plans to produce a hard-copy edition.''
The end of each of the new chapter and appendix files now says ``October 2023'' (where it used to say ``June 2023''). To the SBE's credit, the ``technical edits'' have been responsive to public comment by significantly scaling back the advocacy against middle school Algebra I that was making the CMF internally inconsistent.
But internal inconsistencies still remain in its discussion of middle school Algebra I, so the SBE has not yet achieved ``clarity and alignment''. Much as the CMF recognizes the difficulty of reaching calculus in high school without Algebra I in middle school (see lines 1311-1313 of Chapter 8), it needs to be very clear about the value of offering Algebra I in middle school and providing Algebra I readiness to more middle school students; that would properly honor the legacy of Bob Moses, whose book Radical Equations is quoted early in Chapter 1. Even SFUSD, long the poster child for blocking middle school Algebra I, has seen the light.
Another persistent problem is citation misrepresentation (citations having nothing to do with the topic of discussion or not meaningfully supporting the claims being made). The CMF revision process removed much of what I found in that direction in draft #2, but it has (i) introduced new citation misrepresentations (generally in the service of presenting ideology as if it is evidence-backed research), and (ii) not fixed others. I warned the SBE more than a year ago that all citations must be checked for accuracy, since the CMF has no more credibility on this front; the warning was not heeded.
Amazingly, in some paragraphs the latest CMF revision introduced multiple new citations that are citation misrepresentation. The SBE owes it to the public to determine exactly who is responsible for this persistent citation misrepresentation problem and to ensure that they will never again have a role in the setting of public policy for education in California.
Hype and misinformation centered around data science also persist (primarily in Chapters 5 and 8). The CDE ignored the advice to appoint content experts to rewrite its treatment of data science, and that hype must be removed so the CMF does not mislead the public the way promoters of widely-taught data science courses have been doing for years. In particular, the implicit message that ``data science'' has magical features for presenting math in a compelling and contemporary way is myopic. The failure of Chapter 13 to compel publishers to give teachers a better array of genuine application contexts for motivation is a huge lost opportunity.
Most of these remaining problems, which I have recorded in this linked document, can be fixed by the delete button. Internal inconsistencies and dishonesty in citation choices are unacceptable in public policy documents, so these corrections must be made.
#12 (May 5, 2024 Update) A Strange Complaint
A formal complaint has been submitted to Stanford against me. But this complaint doesn't stand up to even the most basic scrutiny; it is a product of surprisingly sloppy work and inattention to the most basic details and standards of evidence. In this post I explain where it comes from, and then I turn to analyzing its many flaws (Example 3 in Section IV below is perhaps the most astounding).
I. Introduction. On April 13, 2024, a long article with a grandiose title appeared on the website Medium that attacked my integrity with a series of lies and made an array of demonstrably false claims. The initial paragraphs amount to a profile of me bearing no relation to reality (and correspondingly, no evidence is provided or exists for anything that is claimed there). The rest of that article is an error-filled discussion co-authored by 9 people offering a rebuttal to 11 of my many public comment submissions about citation misrepresentation in drafts of the California Math Framework (CMF), and to one other clarification suggestion I made.
The article makes the entirely false assertion (without even a shred of evidence, none of which exists) that I have engaged in a ``tirade of harassment'' against Prof. Jo Boaler, one of the CMF co-authors. It also falsely insinuates via a hyperlink to this Stanford Daily article that I am connected to a recent anonymous complaint filed with Stanford University. I have never had any direct or indirect involvement with that anonymous complaint, and was very surprised when I first heard about its existence (after it had been submitted).
Not only have I never attacked anyone on the CMF writing team, but I was thanked by the State Board for my contributions to the revision of the CMF, and was an invited author for a specific part of it: I was asked by the State Board of Education to co-author a skills-focused guide to the high school math curriculum that ultimately became Appendix A in the final draft of the CMF. Prof. Boaler even graciously thanked me and the other co-authors for that contribution.
As anyone can check for themself, in my written and spoken comments about the CMF I have always focused on the content of the document (and its citations), never singling out a specific CMF co-author as responsible for concerns in the document (let alone engaging in a ``tirade of harassment''). My focus in these matters has always been on accuracy, math, its connection to other fields, and its relevance for students. Anyone is welcome to review my appearances on television and in print to judge for themselves about the focus of what I say.
It is stated in post #2 on my CMF website that ``since I am one person duplicating on my own the work of a 20-person oversight team [called the CFCC] I know mistakes are possible; if you find any then please tell me and I'll gladly fix it''. Early on I made corrections on my CMF website in response to some errors brought to my attention. I was puzzled why the new errata claims in the Medium article about 11 of my citation misrepresentation items (plus 1 other clarification suggestion) were not brought to my attention directly. The first several of these purported errata that I then looked into were full of mistakes that contradicted the claims being made; such sloppiness did not merit further consideration.
II. The Complaint. On May 3, 2024 I was told by the Stanford Daily that a formal complaint had been submitted to Stanford University against me by the President of the Central Section of the California Math Council (CMC). It asserts that I have ``shown a reckless disregard for academic integrity'' in efforts related to the CMF.
Later I looked back more closely at the Medium article and noticed that the text of the formal complaint is nearly identical to the portion of that article authored by the 9 people, including the President of the CMC Central Section, who is the sole signer of the formal complaint. It would be worthwhile to find out (i) why the other 8 didn't also sign this document that is nearly the same as what they had co-authored earlier, and (ii) who actually wrote this complaint.
The submitted complaint is everywhere dense with factual inaccuracies, so it scarcely merits a response. But to make clear how devoid of accuracy it is, I refute its points below. Since the complaint consists of two essentially distinct parts -- an attack on my integrity, and a rebuttal to 11 selected items among the many ``citation misrepresentations'' that I identified in CMF drafts (along with one other suggestion of mine) -- I address these in the separate Sections III and IV below.
Let me emphasize here a spectacular failure of scholarship (discussed more in Section IV) underlying the complaint's objections to 11 among the citation misrepresentation items I identified. In the 9 cases corresponding to my comments on CMF draft 2, the revision process either removed the item from the final version or incorporated my input. In the other two cases, which are citation misrepresentation concerns in the final version, the objections also do not hold water. Given the complaint's high praise of the organization that implemented the revisions, this dismantles its own case.
Even more amazingly, for some of those 9 items corresponding to CMF draft 2, the complaint's objection is that my description of a citation is totally wrong yet its own description is what is totally wrong: the basis for its objection is a description from the final CMF. So of course what the complaint spells out differs from my public comment on CMF draft 2: it was looking at a later version of the CMF, after my input had been incorporated!
Whoever carried out the analysis of the 11 citation misrepresentation examples in the complaint has such a poor grasp of details, and so low a standard of scholarship, that they should never again be involved in setting public policy for education.
III. The Integrity Claims
Let us now go through the complaints lobbed against me in the main text, broken into 11 parts, and see how they all fail. (I can say much more, but what is below seems amply sufficient.)
1. It is said that I have eschewed academic discourse and instead have ``merely posted ad hominem attacks on Dr. Boaler and the California Mathematics Framework (CMF) via his Google Site. In his attacks on the CMF, he has used deeply flawed reasoning about research, showing a lack of understanding of K-12 education.''
Firstly, nowhere on my CMF website (which I made for public comment submissions on CMF) do I ever mention Prof. Boaler or pin my CMF concerns as the fault of a specific person. I never mentioned any of her cited papers in my CMF website post #2 about citation misrepresentation. My goal has always been, and still is, to focus on substantive priorities (mathematical curricular content and how it is organized and used).
My involvement in all of this has been motivated by the purpose of making the CMF more accurate. After I submitted my public comments on CMF draft 2, significant revisions were made in accordance with my suggestions. Every citizen of California has the right to submit public comment on the CMF drafts; especially as a math professor, I took my civic duty seriously and followed the submission procedures set by the California Department of Education. Identifying corrections in a carefully detailed manner during a period of public comment is not an ``ad hominem attack''; it is a public service. In fact, it took considerable effort and hundreds of hours of my personal time to contribute.
The mention of ``academic discourse'' here is a strawman; my involvement in these matters is for the purposes of public policy, not academic debate. As for the claim that I have used ``deeply flawed reasoning about research'', it is worth noting that the many revisions of CMF draft 2 made by WestEd incorporated a lot of my suggested corrections and changes, so WestEd saw value in my feedback.
As the Director of Undergraduate Studies in Math at Stanford for more than 10 years, and overseer for a large number of successful curricular revisions here (which have led to university-level awards), the idea that I lack a sufficient understanding of K-12 math education to be involved in the public discussion about it (focused often at the high school level) is ludicrous. A key part of my DUS responsibility is to oversee university efforts that help students to bridge the transition from high school into a wide range of quantitative college majors. My shared concerns with 440+ higher education colleagues in California having expertise across many quantitative fields have had a positive impact on the final CMF and related developments around the country.
2. ``... he has completely discredited himself by failing to acknowledge the changes made in subsequent CMF drafts and the fact that WestEd has vetted all citations and verified their accuracy.''
False. My CMF website posts #2 through #8 were all about CMF version 2, and in the very next post (#9) about CMF version 3 I did acknowledge the changes due to revision. Indeed, I wrote ``The revision process removed many of the misrepresentations I had flagged, but some misrepresentations have not been corrected, and some corrections have been so cosmetic as to not fix the problem at all. I also found more cases of citation misrepresentation which I had not previously noticed (working by myself).''
I pointed out in detail some places where the vetting process had made errors so that these errors could be fixed in the next version. Just as I acknowledged in my own post #2 that I welcome corrections to errors I have made, the fact that WestEd went through all the citations in the CMF doesn't mean they didn't make errors either.
We don't owe blind faith to WestEd; public education policy should be transparent, and as citizens we can check things for ourselves too. Where I saw that errors remained and/or new misrepresented citations had been introduced, it is responsible citizenship that I point these out in a careful and detailed manner as public comment so they can be fixed. The idea that I am ``completely discredited'' by any of this completely discredits whoever makes such statements.
3. ``...Moreover, he has gone beyond critiquing the research and ventured into stochastic terrorism through indirect and vague attacks on Professor Jo Boaler's work which has led the public to myopically targeting Dr. Boaler rather than the entire CMF writing team.''
This statement is entirely false. The fact that the public has more awareness of one CMF writer over the others has nothing to do with me. It long predates my involvement in these matters, much as widespread public concern about early versions of the CMF predates my involvement (as multiple high-visibility documents spelled out).
Whenever I have been asked for commentary about a specific CMF author, I have always declined. Furthermore, in my submitted public comment on citation misrepresentation I never mentioned the CMF citations to published work by Prof. Boaler. In the absence of any evidence at all, this claim about vague attacks is ironically itself a vague attack.
4. ``Conrad's repeated efforts to attack previous versions of the framework, completely ignoring the CMF has been updated and unanimously approved, is disrespectful to Dr. Boaler who deserves respect for her participation on the CMF revision writing team.''
First of all, the statement that I ``completely ignored'' any such thing is factually inaccurate. Posts #10 and #11 on my CMF website explicitly state at the outset that the CMF was unanimously approved. The purpose of those posts was to identify more places where corrections are needed. The mere fact that a document received unanimous approval does not ensure it is completely accurate.
The State Board openly invited additional feedback after their unanimous vote, and I was thanked by the Board for the further corrections that I submitted in response to that request. My efforts had nothing to do with respect or not toward anyone on the writing team.
5. ``It is Conrad's responsibility, as an academic, to correct the narrative. Unfortunately, he is still trying to discredit the framework as seen in his participation in the SB1411 hearing, in which he provided testimony on one of three bills in the math excellence package presented by Senator Ochoa Bough [sic].''
This is another entirely factually inaccurate statement. My participation in the hearing for Senate Bill 1411 is a matter of public record and can be reviewed by anyone who is interested in the facts. (See this video, beginning at 1:47:49; I am the second person to testify there.) That bill expands the involvement of content experts from across public higher education in the development of future curricular frameworks in all fields. It has been officially supported by the Academic Senate of Cal State and unanimously passed the Education subcommittee of the state Senate.
My invited testimony to that subcommittee preceding its unanimous vote focused on the value of such expanded communication between K-12 and higher education (noting its alignment with a joint position statement from the National Council of Teachers of Mathematics and Mathematical Association of America), and on preparing California's students for jobs of the future. It made no reference whatsoever to the CMF, as anyone can read for themself, so the suggestion that my testimony was part of a (non-existent) ongoing effort by me to ``discredit the CMF'' is incompatible with reality.
Moreover, as an academic with content expertise in math, it is responsible citizenship for me to point out errors in public policy drafts related to math where I see them, in a carefully documented way during appropriate time periods so that corrections can be made. I did that more thoroughly than any other person in the state of California. There is no ``narrative'' to be corrected.
6. My CMF website that shared my public comment submissions with the wider public is described as ``A private website where he shares his Stanford position and title, suggesting that Stanford is opposed to the framework''
First, there is nothing private about my CMF website; it is world-viewable. All of my CMF comments on this website were submitted as public comment in a timely manner to the California Department of Education.
Second, on a personal website it is entirely standard to mention one's professional affiliation. My professional position and title (including being Director of Undergraduate Studies in Math) have obvious relevance to the context for my feedback on the CMF. No implicit message about an institutional view on the CMF is conveyed through the use of that website.
When an item about the CMF was posted on the Stanford-owned math department website, it was stated clearly at the end that Stanford takes no position on the CMF. There is a webpage hosted by Stanford's Graduate School of Education that links to many opinion pieces related to the CMF, with no caveat about an institutional view on the CMF. The latter seems not to be a cause for concern to the complainant, so this accusation is baseless.
7. It is said that I have ``deliberately engaged in attempts to paint Prof. Boaler as solely responsible for the production of the CMF and its ideas within, taking his willful place in an ecosystem of anonymous complaint filers, an uncivil and threatening Twitter mob, rival attention-seeking professors, and political media.'' (the hyperlink as in the original points to an article about an anonymous complaint filed with Stanford).
This is disconnected from reality in numerous ways: (i) as I have already mentioned, I had no direct or indirect involvement with that complaint, (ii) I never knew about its existence before it was submitted, (iii) I have no social media presence.
The follow-up accusation ``This unprofessional behavior, causing a hostile work environment for another professor and -- more than that -- a hostile world environment, directly contravenes Stanford's policies stating that faculty should work in a safe environment free from hostility. Further, he is preventing Dr. Boaler's ability to conduct research, one of the fundamental rights of a Stanford faculty member.'' is similarly lacking any evidentiary connection to me (and no such evidence exists). Such a formal accusation of unprofessional behavior is so bereft of evidence, and transparently false, that it is arguably itself unprofessional behavior.
8. ``Dr. Conrad is positioning himself as an expert in mathematics education, even though it is my understanding he has limited experience teaching in a TK-12 classroom. The following examples shed light on his reasons for opposing the CMF, yet his perspective favors only a minority of the state's student population. In particular, his critiques highlight research that positions all students as capable of accessing rigorous mathematics, including multilingual learners and students with disabilities. Notably, Conrad aligns with the past beliefs of individuals within Stanford's mathematics department, historically opposed to mathematics reforms.''
I have never ``positioned'' myself as anything other than what I am: a professional mathematician with a lot of experience speaking to colleagues in many quantitative fields and with many years of involvement in successful curricular revisions to help students from all levels of preparation to achieve success in university-level quantitative coursework. This is clearly said in the Background section at the top of my CMF website.
Since I have been at the forefront of numerous successful extensive revisions of introductory Stanford math courses that focused on making the content and pedagogy more engaging and contemporary, the suggestion that I am ``opposed to mathematics reforms'' is absurd.
The claim about my ``perspective'' makes no sense. What ``perspective''? My opposition to replacing substantive mathematical content with math-lite content promoted under false promises is rooted in concerns about off-ramping from quantitative opportunities for all students, focused on the jobs of the future. This opposition is shared with very many content experts from across California higher education (and we all support connecting data science to substantive mathematical content; I have never said or written otherwise).
Furthermore, what does it mean to ``oppose'' the CMF? I persisted in submitting detailed and precise public comment to make the CMF more accurate; that isn't ``opposition''. My public comment submissions on the CMF addressed a wide swath of its content themes, not limited to the few listed in the quoted accusation, as anyone can check for themself by looking at my CMF website.
Finally, the claim that I am ``aligned'' with something from the past in Stanford's math department (apparently the 1990's) is entirely false. I have never communicated about K-12 math education with the people at Stanford from that time (and when offered the chance to do so, I declined). I have paid essentially no attention to what was done back then, since it is immaterial to the substantive content of my feedback on the CMF and related matters.
9. ``Most of the critiques on Conrad's website apply only to an old, first draft of the framework, not the version that was adopted by the state in July 2023.''
My public comment submissions were primarily on the second draft, not the first draft. (The only submission I made during revision of the first draft was a document I co-authored at the request of the State Board of Education.) Most of my public comment concerns the second draft rather than the later drafts because a full 60-day period was provided for public comment on the 1000-page second draft, whereas the third draft gave the public only 11 days (including the July 4 long weekend) to provide public comment. Moreover, it is a sign of improvement if there aren't as many concerns for a later draft (though in this case the much shorter time window for public comment limited the scope for feedback).
Earlier CMF drafts disappeared from the CA Department of Education website, but I have kept copies of all of them due to the work I put into this. I make it very clear at the top of my CMF website which CMF draft each post is about, and I am happy to share copies of the earlier CMF drafts with anyone who asks for them. Copies of them all can also be found here.
10. ``The work of the Math CFCC subcommittee falls under the purview of the Bagley-Keene Open Meeting Act. This directive states that all the subcommittee meetings were open to the public, and each meeting provided an opportunity for public comment. The Math CFCC members carefully reviewed both oral and written public comment when submitting ideas for review. Prof Conrad did not submit written suggestions, nor engage in the statutory process during the proceedings.''
This accusation is downright bizarre. It is not my job to manage subcommittee-level details, and, more importantly, my involvement with the CMF began only in summer 2021, whereas the CFCC meetings took place in August 2020. The insinuation that I did something wrong by not engaging with the ``statutory process during the proceedings'' for the CFCC public sessions way back in 2020 is therefore completely ridiculous, even setting aside the ongoing pandemic during that time.
11. ``Conrad approached his review of the framework with an agenda: a number of his critiques target research showing the potential of all students to learn, and research sharing schools relating mathematics to students' cultures.''
This is balderdash. My sole focus in these matters has been to (i) promote wider awareness about math content, and (ii) root out inaccuracy where I find it, providing supporting details so that corrections can be made. These concerns touched upon the full swath of the CMF, not prioritizing specific areas over others; the selection of two specific areas in this quoted accusation is therefore cherry-picking.
I have never disagreed with ``the potential of all students to learn'' or opposed presenting math in a way that is relevant to students' lives. Never. For example, in my public comment on Chapter 13 of the CMF version 2 (about K-8 textbook approval criteria) I urged revision to ``explicitly emphasize that publishers are expected to provide engaging contemporary motivational contexts for the introduction of topics in their material.'' This was urged to make math accessible to all students.
IV. The Citation Misrepresentation Claims
The complaint selects 11 instances of citation misrepresentation that I documented on my CMF website, along with one instance of a clarification suggestion, and raises objections to each of them.
Let me first address the objection to my clarification suggestion. I suggested in my public comment on Chapter 9 of CMF draft 2 that the remarkable-sounding case of a student diagnosed in youth with a low IQ who then went on to earn a PhD in applied math should mention the fact (which I found by looking into the cited literature) that the student had dyslexia. Indeed, that can readily explain a low-IQ misdiagnosis, providing highly relevant context (in my opinion). The CMF revision chose not to follow this suggestion; fair enough.
The complaint argues that the purpose of this example is to illustrate that ``students can overcome barriers to excel at mathematics'', and it says that omitting the mention of dyslexia ``does not diminish this important message''. I never said the possibility of overcoming barriers should be devalued, but what is the specific barrier being overcome here? Is it low IQ, or dyslexia? I was arguing for more information to provide relevant context. The reader can decide for themself if the fact of dyslexia is relevant to evaluating what really happened in this case, but it is clearly a standard example of disagreement on what information is pertinent. Including this clarification suggestion for the CMF as part of a claim of academic misconduct makes no sense.
Let us now focus on the complaint's 11 selected cases among the many I identified as citation misrepresentation: nine for CMF draft 2 and two for the final CMF. Before diving into its objections, the complaint summarizes the situation by saying:
``In all cases the critiques given by Conrad are weak, irrelevant, or reflect a lack of understanding of research in education. Conrad has attempted, many times over, to wield his weak critiques to stop the mathematics framework in California. Those attempts have obviously failed, and he is now moving to halt reforms under consideration in the rest of the country, sharing the same set of claims that the California framework used flawed research.''
The absurdity of that passage is reflected in the fact that all nine examples of citation misrepresentation for CMF draft 2 analyzed in the complaint were either removed from the final version or revised by WestEd to incorporate my suggestions. Since the complaint refers to the revision work of WestEd (which handled all revisions from draft 2 onwards) as ``rigorous'', WestEd's use of my input in all nine cases contradicts the objections.
Even more remarkably, in some cases the complaint objects that my public comment doesn't match a description of the citation in the CMF yet the complaint is referring to the wrong version of the CMF: the final version rather than draft 2 that I was writing about (see Example 3 below for a sample)! So the complaint is objecting to its own confusion. This is ridiculous.
Here are four examples that illustrate the array of blunders in these nine instances (there is no point in going through them all here; similarly inaccurate scholarship pervades everything):
Example 1: In CMF draft 2, there is a reference to a 2015 paper labeled as "Menon et al." and its analysis of math learning. There were two papers by Menon from 2015 in the bibliography of draft 2: one by Menon alone that never mentions math, and one by Menon and others.
Obviously the CMF had to be referring to the latter paper, since "et al" refers to there being multiple authors. More substantively, the solo paper by Menon makes no mention of math (or arithmetic), as anyone can check. I felt that the multi-authored paper was being badly misrepresented, and explained this at length in my submitted public comment (and noted the need to clarify which of Menon's two papers in 2015 was being cited).
The complaint says I am sloppy for that citation misrepresentation claim: it insists what was intended in the CMF was the solo paper by Menon to which I wasn't referring. But that is obviously not true, for the two reasons I explained above. How can the complaint argue that a paper with no mention of math is the intended reference for a CMF discussion of math learning? WestEd's revisions even removed Menon's solo paper from the bibliography of the CMF. The complaint's objection to this citation misrepresentation concern is total confusion by the author of the complaint.
Example 2: The paper of Menon et al. to which I refer in Example 1 is still cited in the final version of the CMF, and in my public comments on the final version I readily acknowledged that the description of this paper's results (on tutoring intervention for children with learning disabilities in math) had removed some earlier misrepresentations. However, I explained that the paper's results were still described in a manner that could be misunderstood by a typical reader (e.g., district staff, a teacher, or a parent), due to the limited size of the study and the limited scope of skills it explored. I recommended clarifications, so that readers wouldn't come away with more dramatic conclusions than the paper's context could justify.
The complaint raises two invalid objections. Firstly, it insists that ``Educators are fully aware that studies show a particular case of achievement.'' But it is completely implausible that a typical reader is going to get a copy of the paper and read it; they are going to base all impressions on the description within the CMF, which did not convey a sense of the study's limited size and scope. Perhaps an education researcher knows the need to go back to the original paper, but the CMF is written for use by district staff and parents, not education researchers.
Secondly, the complaint insists that my concern about this potential for misunderstanding reveals ulterior motives: it says that my ``continued push to discredit evidence that shows the improvement of students with special needs, seems to be revealing his motives for trying to discredit the framework.'' This is beyond crazy. I was never ``discrediting'' any evidence; to the contrary, I was insisting on more clarity about the actual scope of the cited paper's conclusions.
How is the author of the complaint divining that I oppose the idea that students with special needs can improve their learning of math? I have never expressed such a view anywhere, since I hold no such view. This objection to my citation misrepresentation concern is a delusion by the author of the complaint.
Example 3: In Figure 5.11, CMF draft 2 referred to some data from a paper of Clark et al. in a manner that did not actually illustrate the point being made in the CMF. I explained this at length in my submitted concern and said that a different example was needed. WestEd addressed this by removing the reference to the paper of Clark et al.
The complaint insists that my description of Figure 5.11 is incorrect and that I overlooked its focus on sample size. But that is totally wrong, since what it describes as Figure 5.11 is from the final version of the CMF, which has nothing to do with Figure 5.11 from CMF draft 2 to which I was referring. Indeed, the final CMF's Figure 5.11 concerns oceanographic data, whereas CMF draft 2's Figure 5.11 concerns air pollution in cities from a paper of Clark et al. So the complaint's objection to my concern reflects its author's error of reading the wrong version of a document. I don't need to say more on such sloppiness.
Example 4: The CMF draft 2 discussed the 1892 report of the "Committee of Ten" which proposed a two-pathway math curriculum for high schools. It said, as the complaint against me notes, that ``The traditional sequence of high school courses -- Algebra, Geometry, Algebra 2 -- was standardized in the United States following the “Committee of Ten” reports in the 1890's. The course sequence -- which was primarily designed to give students a foundation for calculus -- has seen little change since the Space Race in the 1960's''.
I raised the concern that this passage can appear to be saying that the traditional course sequence, and consequently the focus on a path to calculus, goes back to the 1890's (which is not actually true; the history is more nuanced, as my submission explained). I urged the removal of this topic due to a widespread narrative I saw in the media (that I branded "the myth of 1892") declaring the usual math course sequence is an obsolete relic of the very distant past. Even a CFCC member was confused about this, saying during a CFCC meeting that the usual high school math content is an ``antiquated pathway'' that ``was developed for the needs of the 1800's, the early 20th century'': see this video.
That is why I was concerned: if even CFCC members have been misled about the history concerning the Committee of Ten, surely the public will get confused. The complaint insists ``there is nothing inaccurate'' in the CMF passage, but it is completely missing my point. WestEd, whose revision work the complaint repeatedly praises, did what I recommended: all mention of the Committee of Ten is gone from the final CMF. There is no issue here.
V. Conclusion
The public comment submissions I made on CMF drafts led to complete or partial fixes for a wide array of concerns. Moreover, I was invited by the State Board of Education to co-author a guide to high school math skills, regarded as a necessary contribution for clarity on high school preparation for quantitative college degrees. This became Appendix A of the final version, and I was thanked multiple times by the State Board of Education for my contributions to the CMF development process. That work is public service.
Parents and college faculty around the country have contacted and thanked me for sharing information based on my experience and expertise. I will continue to do so, as a recognized source of reliable guidance on readiness for quantitative college degrees and career paths.
The formal complaint submitted against me is permeated with negligence and sloppiness, as spelled out above. It is an instance of misguided scholarship and confused thinking.