
## Math Task Force’s Bad Calculation

by Mike Winders and Richard Bisk

September 30, 2014

The number of incoming college students who require developmental mathematics coursework is a national problem. As reported by the National Center for Education Statistics, 42% of students entering college for the first time in fall 2003 took a developmental math course. At our institution, Worcester State University, 54% of students entering in fall 2004 placed into developmental math. This is an enormous concern for several reasons: students pay for courses that carry no credit, and colleges and universities must pay instructors to teach them. In this article, we examine a policy change recently implemented by the Massachusetts Board of Higher Education (MBHE) that seeks to address this issue by drastically changing how students are placed into their first college-level math class.

The current process for placing incoming students into their first mathematics course stems from the MBHE’s 1998 Common Assessment Policy of Massachusetts, which was based on a report from the Mathematics Assessment Task Force. All members of that committee had a background in mathematics, and half held mathematics faculty positions. Detailed minutes of all meetings, including all votes, were part of the report. The policy requires all incoming students to take the Accuplacer Elementary Algebra exam, which covers topics found in an Algebra I course (typically taken in eighth or ninth grade), and it mandates a “cut score” that determines whether an incoming student is placed into a developmental course.

A decision to change this process was made in October 2013 when the MBHE decided to implement four “primary and comprehensive recommendations” made by the Task Force on Transforming Developmental Math Education. The MBHE report indicates that the task force had 17 members. In contrast to the 1998 task force, only five of the members are listed as having current positions that involve math instruction. One other member is a former mathematics professor. In further contrast, the report includes neither minutes nor records of votes. Attempts to obtain them suggest that there are no such records.

The task force’s first recommendation is that recent high school graduates whose high school GPA is 2.7 or higher are exempt from the initial placement exam (currently Accuplacer) and should be placed directly into the lowest college-level math course appropriate for their chosen pathway of study. Further, high school graduates whose high school GPA is lower than 2.7 but higher than 2.4 and who have successfully passed four math courses including math in their senior year are exempt from the initial placement exam and should be placed directly into the college-level math course appropriate for their chosen field of study. To clarify, this refers to overall high school GPA, not just GPA in high school math classes. We strongly disagree with this recommendation.

Our first point of contention is the stance the task force takes regarding developmental coursework. The report states that “students who enroll in developmental coursework are less likely to graduate … these students often become discouraged and never reach a point where they even attempt an entry-level course.” The first part of the statement should surprise no one. Along the same lines, first-year students who fail courses are less likely to graduate. Should we deal with this by banning failing grades for first-year students?

The second part of the statement suggests that the best way to eliminate the discouragement a student experiences when faced with developmental work is to place them automatically into a credit-bearing course. Inherent in this conclusion is the assumption that such a student will succeed in that course, even though, in many cases, they will lack the background skills necessary for success (skills a developmental math course would teach them). The task force neglects to consider how discouraging it can be for a student to repeat, multiple times, a course for which they are not prepared.

Our concern about students struggling in courses for which they are not prepared is based in experience. A few years ago, our administration waived placement test requirements for transfer students. During the summer of 2013, we had a pre-calculus student who transferred in a course equivalent to college algebra from a community college. Under the old policy, this student would still be required to take Accuplacer; under the new policy, however, the student was allowed to register for pre-calculus. This student worked hard, asked questions, and scored 4% on the first midterm exam. He subsequently withdrew from the course. In discussions with him, we suggested that he retake college algebra, because he was clearly lacking the skills necessary to allow him to succeed in pre-calculus. He opted instead to try pre-calculus again in the fall. Once again, he worked hard, asked questions and this time, he scored 6% on the first midterm. He again withdrew from the course, and finally, after a year, he agreed to sit in a college algebra course to build up his skills. This student was done an incredible disservice by being deemed prepared for a course for which he was clearly not ready. He wasted hundreds of dollars, and we can say from direct conversations that he was incredibly discouraged from the experience. We understand that this anecdote does not specifically apply to developmental coursework, but our overall point is that placement tests serve a vital purpose, and when students are deemed prepared for the class of their choosing without testing their basic skills, we fear such situations will occur far more frequently.

Another point of contention is the task force’s exemption from placement testing of all incoming students with a high school GPA of 2.7 or above (or, for students who passed four math classes including one in their senior year, 2.4 or above). We again worry that such a policy will produce a multitude of students who are woefully underprepared for their first college-level math class. Under this policy, a student who received a D- in every math class taken in high school (and, in the first case, who didn’t even take a math course in senior year) will be deemed ready for college-level math so long as their overall high school GPA is at least 2.7. Anyone who assumes such a student will succeed in their first college-level course either grossly underestimates the rigor of college-level mathematics or expects that standards in such courses will be lowered to accommodate so many ill-prepared students. We have spoken with several members of the task force to try to ascertain the research base for this recommendation. No specific information was provided; responses ranged from “I missed that meeting” to “Ask DHE staff.”

The College Board’s 2013 State Profile Report for Massachusetts provides information about the high school GPAs of college-bound seniors. The average GPA of those who reported it was 3.23, and only 13% indicated GPAs below 2.7. This suggests to us that 2.7 is an extremely low threshold.

We also have serious reservations about the apparent research basis for the task force recommendations, namely “Predicting Success in College: The Importance of Placement Tests and High School Transcripts,” a 2012 paper by Clive Belfield and Peter M. Crosta of the Community College Research Center (CCRC) at Teachers College, Columbia University. The paper examined only community colleges and included no data from the Commonwealth of Massachusetts. We are also concerned with how the task force interpreted the paper. In its draft report, the task force summarizes the paper as follows:

This is a misleading summary, as the task force seems to confuse predictors of college success with accurate course placement. While the two are related, a placement test does not tell whether someone will succeed in college math classes, but rather whether they have the knowledge base to do so. Before we implemented our current placement program at WSU, students were able to enroll in college-level math classes regardless of their placement test scores. Not surprisingly, we saw very high failure rates among these students across a broad range of classes.

But perhaps most importantly, the CCRC paper’s authors acknowledge that their work on validity metrics is based on extrapolation. In particular, since students who score below the Accuplacer cut score are placed into a developmental class, there is no direct data on how such students would fare if placed directly into a college-level class. To predict that, one must extrapolate (see Appendix for hypothetical examples).

According to our first hypothetical example, we can linearly extrapolate below the cut score to “conclude” that 60% of students who score a 20 on Accuplacer (for reference, the cut score is 82) would pass their first college-level math course. One can see how if this conclusion were, in fact, valid, an argument could be made for eliminating placement testing. Unfortunately, there is absolutely no evidence to support a curve of this shape for scores on the placement test below the cut score. To further illustrate the dangers of extrapolation, consider the next set of scatterplots in our Appendix that plot median heights of boys ages 2 to 20. Suppose we had never seen a boy under age 15 and we only had data for boys ages 15 to 20. As in our above hypothetical, we could extrapolate below the age of 15 to “conclude” that a 2-year-old child would be over 5 feet tall!
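The height example above can be sketched in a few lines of Python. The heights below are approximate median statures for boys ages 15 to 20, in inches; they are illustrative values supplied for this sketch, not the data behind the Appendix scatterplots. Fitting a line to this narrow age band and extending it down to age 2 “predicts” a toddler over five feet tall, where the true median is roughly 34 inches.

```python
# A minimal sketch of the extrapolation fallacy described above.
# Heights are approximate median statures for boys, in inches
# (illustrative values, not the article's actual data).
ages    = [15, 16, 17, 18, 19, 20]
heights = [67.0, 68.3, 69.1, 69.3, 69.4, 69.6]

# Ordinary least-squares fit of height = slope * age + intercept.
n = len(ages)
mean_a = sum(ages) / n
mean_h = sum(heights) / n
slope = sum((a - mean_a) * (h - mean_h) for a, h in zip(ages, heights)) \
        / sum((a - mean_a) ** 2 for a in ages)
intercept = mean_h - slope * mean_a

# The line fits reasonably within ages 15-20, but extrapolating far
# below that range "predicts" a 2-year-old taller than 5 feet (60 in),
# versus an actual median of about 34 inches.
predicted_2yo = slope * 2 + intercept
print(f"Predicted height at age 2: {predicted_2yo:.1f} in")
```

The fit itself is fine; the error lies entirely in applying it outside the range of observed data, which is exactly the objection to inferring pass rates for students far below the cut score.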

We also question why community colleges and state universities are subject to the same recommendations. The two differ in major ways, from mission to student population and demographics. In particular, community colleges have open enrollment, which brings in far larger numbers of underprepared students than state universities see. According to data in the task force’s report, in fall 2010, 53% of incoming community college students required developmental math, compared with 23% of incoming state university students. The composition of the task force itself also slants heavily toward a community college perspective: of its 17 members, over half are affiliated in some way with a community college, while only one is currently a faculty member in the mathematics department of a state university.

In addition, the majority of the research (nine of 17 papers) used to support the task force’s recommendations originates from the CCRC at Columbia University. Indeed, although the task force cites 17 resources, they come from only three sources: the CCRC, Jobs for the Future, and Complete College America. We certainly understand the need for reform at the community college level, and we appreciate the vitally important role community colleges play. But we are very concerned that state universities are subjected to the same recommendations as community colleges when voices from the state university system were a vast minority in both task force membership and research basis.

In the section of the task force’s report titled “Charge to the Task Force on Transforming Developmental Math Education,” four areas were cited as being highlighted in the 2011 Final Report of the Working Group on Graduation and Student Success Rates. Under the first area, “Research and Education,” a bullet point states: “Review innovative practices currently in place at colleges within and outside of Massachusetts and create initiatives which successfully scale up best practices across multiple campuses.” However, the successful program we have at Worcester State was ignored in the task force report. Why? As mentioned above, in 2004 we had a failing entry-level program, with 54% of our entering students placed into developmental math and only 30% of these students passing their developmental math course. By working closely with our administration, we embarked on a data-driven redesign of the program, including careful statistical analysis of the effectiveness of our changes. Through efforts to increase students’ awareness of the placement process, as well as improve their mathematical preparation, the percentage of entering students requiring remediation decreased from 54% in 2004 to 24% in 2006. Additionally, in our redesigned developmental math classes, pass rates increased from 30% in 2004 to 80% in 2009, where they have remained. (For more detail on our program, see “Successful Developmental Math: ‘Review-Pretest-Retest’ Model Helps Students Move Forward,” published in The New England Journal of Higher Education. It is surprising that this paper was not included as a reference in the task force report.)

We feel that implementation of the task force recommendations will result in either pressure to lower standards in entry-level math courses or increasing numbers of students failing their first college-level math course. We understand and commend the desire of the task force to improve developmental education, but there is scant evidence that these recommendations will have that effect.

Mike Winders is associate professor of mathematics at Worcester State University. Richard Bisk is professor of mathematics at Worcester State University and was math department chair from 2004 to 2012.

Tags: math, remediation, Worcester State University

## 7 Responses to “Math Task Force’s Bad Calculation”

One of my pet peeves is mathematics education professors misidentified as mathematics professors (examples abound). This is what happens when the real thing looks at what's happening. The recommendation was basically to extend our all-too-common, outrageous level of grade inflation to college and university admissions. What they really want is for us to extend it all the way through "graduation".

Wayne Bishop

Professor of Mathematics

California State University LA

Mathematics is a structured subject: material in one course builds on knowledge from prior courses, and that knowledge is indispensable. Students who have not learned Algebra I well enough are not prepared for more advanced coursework. A policy that suggests otherwise makes no sense. The Massachusetts Board of Higher Education should reconsider its new policy.

Solomon Friedberg

Professor of Mathematics

Boston College

It's not hard to predict the effects of basing the developmental math requirement on overall HS GPA: the percentage of students taking developmental math courses will drop, and the percentage of students failing credit-bearing math courses will rise. On paper it will look like the Commonwealth has reduced the embarrassing percentage of students in developmental math courses, while in reality no student will benefit from this sleight of hand. This is not how we should address the difficult problem that math education at all levels in Massachusetts is insufficiently preparing our students.

Steve Rosenberg

Professor of Mathematics

Boston University

I recall the anecdote about the man who drowned in a river whose average depth was one foot. Anecdotes such as this abound, and the point is that averages can be very deceiving. Consider a student who has taken five equally weighted tests and obtained an average of 80 points per test. All we can conclude from that statistic is that the student accumulated a total of 400 points on the five tests. The student may have scored 80 on each test, but may equally well have scored 100 on four of the tests and 0 on the fifth. In the same vein, I worry that a student’s 2.4 GPA might have been 3.0 had the math grades not been included in computing it. And if D is considered a passing grade, the student could have compiled a GPA of 2.4 or better without earning any grade above D in mathematics.

In an analogous vein, how often have we heard students exclaim, “I could do everything in the course except the word problems”? Unfortunately, in the “textbook of life” there are only word problems! Too often, teachers eliminated the chaos that word problems injected into the mathematical lives of their students by ensuring there were few enough word problems on a test that students could receive a passing grade even if they failed to get a single word problem right. I wonder whether the BHE decision will create a similar situation, at least in the sense that many more students will fail to pass their first for-credit math course at the college level.

There is no doubt that getting developmental mathematics courses right is one of the most difficult challenges facing us as we try to meet the national demand for broadening the entryway to STEM careers. The recommendations of the Task Force on Transforming Developmental Math Education amount to giving up on this important task for a majority of the students now served by these courses, and the MBHE erred in accepting its recommendation. The flaws in the report's use of research findings would merit an F in any basic statistics course. A professional reading of the research data and contemporary efforts in this domain would have reached quite a different conclusion. In place of this regressive position, the Massachusetts Board of Higher Education would do better to consider alternatives, such as embracing the excellent New Mathways Project, created and supported by the University of Texas's Dana Center and adapted by the State of Texas.


As a parent, I have two issues: we have to pay for these developmental math courses even though the student gets no credit, and a student trying to catch back up to the average of five courses per semester needed to graduate in four years must now take SIX courses in one semester, a workload heavy enough that all courses might suffer. I think the simple answer is to let these developmental math courses count for credit as electives. Instead of allowing credit for underwater basketweaving, allow credit for a course that is actually intended to help them.