tag:blogger.com,1999:blog-7718462793516968883Tue, 15 Apr 2014 17:07:13 +0000AngryMath"Beauty is the Enemy of Expression"http://www.angrymath.com/noreply@blogger.com (Delta)Blogger154125tag:blogger.com,1999:blog-7718462793516968883.post-5159581409361025698Tue, 15 Apr 2014 09:00:00 +00002014-04-15T13:07:13.238-04:00Multiple Choice ChancesLet's say you have a final-exam assessment that is a multiple-choice test, with 25 questions, each of which has 4 options, and requires a 60% score (15 questions correct) to pass. As one example, consider the uniform CUNY Elementary Algebra Final Exam (<a href="http://www.ccny.cuny.edu/testing/cunyelementaryalgebrafinalexam.cfm">link</a>).<br /><br />How robust is this as an assessment of mastery in the discipline? As a simple model, let's say that any student definitely knows how to answer N types of questions, but is randomly guessing (uniform distribution over 4 options) for the other questions. Obviously this abstracts out the possibility that some students know parts of certain questions and can eliminate certain choices or guess based on the overall "shape" of the question, but it's a reasonable first-degree model. Then the chance to pass the exam for different levels of knowledge is as follows:<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="http://1.bp.blogspot.com/-gg2NqJrOnwk/U0dUCsF0vDI/AAAAAAAAC2g/Ey7Ce8Wnt7Y/s1600/MultipleChoiceChance.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://1.bp.blogspot.com/-gg2NqJrOnwk/U0dUCsF0vDI/AAAAAAAAC2g/Ey7Ce8Wnt7Y/s1600/MultipleChoiceChance.gif" height="320" width="247" /></a></div><br />Obviously, if a student really "knows" how to answer 15 or more questions, then they will certainly pass this test (omitted from the table). But even if they only know half of the material in the course, then they will probably pass the test (12 questions known: 67% likely to pass). 
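These pass chances come straight from the binomial distribution on the guessed questions; here's a quick Python sketch (the code and function names are my own, reproducing the table's computation under the simple model above):

```python
from math import comb

def pass_chance(known, total=25, need=15, options=4):
    """Chance of passing when `known` questions are answered with
    certainty and the rest are uniform random guesses."""
    guesses = total - known
    needed = need - known          # correct guesses still required
    if needed <= 0:
        return 1.0                 # already knows enough to pass
    p = 1 / options
    return sum(comb(guesses, k) * p**k * (1 - p)**(guesses - k)
               for k in range(needed, guesses + 1))

print(round(pass_chance(12), 2))   # 0.67 -- knows half the material
p9 = pass_chance(9)                # about 0.19 -- knows one-third
print(round(1 - (1 - p9)**3, 2))   # 0.47 -- chance over 3 attempts
```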
Of students who only ever know about one-third of the basic algebra content, but retake the class 3 times, about half can be expected to pass based on the strength of random guessing on the rest of the test (9 questions known: 19% likely to pass; over 3 attempts the chance to pass is 1-(1-0.19)^3 = 1-0.53 = 0.47). <br /><br /><br />http://www.angrymath.com/2014/04/multiple-choice-chances.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-7837150053487510452Thu, 03 Apr 2014 09:00:00 +00002014-04-03T05:00:04.220-04:00Meta-Research Innovation CentreAn interesting article about Dr. John Ioannidis at Stanford founding the "Meta-Research Innovation Centre" to monitor and combat weak and flawed statistical methods in science research papers, especially medicine. Good luck to him!<br /><br /><div style="text-align: center;"><a href="http://www.economist.com/news/science-and-technology/21598944-sloppy-researchers-beware-new-institute-has-you-its-sights-metaphysicians">http://www.economist.com/news/science-and-technology/21598944-sloppy-researchers-beware-new-institute-has-you-its-sights-metaphysicians</a></div><div style="text-align: center;"><br /></div><div style="text-align: center;"><br /></div>http://www.angrymath.com/2014/04/meta-research-innovation-centre.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-3362312904053430190Thu, 27 Mar 2014 09:00:00 +00002014-03-27T05:00:10.090-04:00Gears of WarWhen I was a kid, one of my favorite pastimes was the Avalon Hill wargame <i>Bismarck</i> about fighting ships in World War II (see it reviewed on my gaming site <a href="http://deltasdnd.blogspot.com/2010/10/retrospective-bismarck_11.html">here</a> and <a href="http://deltasdnd.blogspot.com/2011/05/bismarck-for-two.html">here</a>). 
In junior high school, at some point my English teacher asked me what I wanted to do as a career, and was completely appalled when I said "I want to join the Navy and control the main guns on a battleship". (I think I'd share her dismay if someone told me something like that today.)<br /><br />Anyway, over at Ars Technica, a wonderful article has been written by Sean Gallagher (former Navy officer and IT editor) on exactly how the fire control systems on those ships did their jobs -- solving 20-variable calculus problems in real-time (accounting for moving, pitching, rolling, recoiling, Coriolis-spinning projectiles on both ends) with shafts and gears, with accuracy that is hard to beat even today with digital computers and GPS-driven rocketry. There are lots of insightful videos about the components and gears used to do input, sums, multiplies and divides, and spinning disks that can do complicated functions like trigonometry and more. <br /><br />To me, this stuff is completely like crack. Check it out:<br /><br /><div style="text-align: center;"><a href="http://arstechnica.com/information-technology/2014/03/gears-of-war-when-mechanical-analog-computers-ruled-the-waves/">http://arstechnica.com/information-technology/2014/03/gears-of-war-when-mechanical-analog-computers-ruled-the-waves/</a></div><br />(Also: Further commentary and links at recently-established news site <a href="http://soylentnews.org/articles/14/03/18/137232.shtml">SoylentNews</a>.)<br /><br /><br />http://www.angrymath.com/2014/03/gears-of-war.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-8919420559055557940Thu, 20 Mar 2014 09:00:00 +00002014-03-20T05:00:09.783-04:00FiveThirtyEightNate Silver (the statistician who famously predicted all 50 states correctly in the last presidential election) recently expanded his FiveThirtyEight blog to a full-blown "data journalism" site. 
His first post was a manifesto on data, science, statistics, politics, journalism, and honest storytelling in general. I agree with almost all of his observations here. The guy really knows his stuff and has a fiery passion for his particular mission. Great stuff.<br /><br /><div style="text-align: center;"><span style="font-size: large;"><a href="http://fivethirtyeight.com/features/what-the-fox-knows/">http://fivethirtyeight.com/features/what-the-fox-knows/</a></span> </div><br /><br />http://www.angrymath.com/2014/03/fivethirtyeight.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-2469127718298387404Mon, 10 Mar 2014 09:00:00 +00002014-03-10T17:43:00.985-04:00Faulty FactoringHere's something I think I see a few times in any college algebra class: a really weird way of accomplishing quadratic factoring. (More generally, this might go in a larger file of "things students swear are taught by other instructors which are semi-insane" -- including whacked-out order-of-operations, keep-change-change for negatives, the idea that -4<sup>2</sup> means (-4)(4), etc.).<br /><br />Anyway, let's say we want to factor what I call a "hard quadratic", i.e., Ax<sup>2</sup>+Bx+C, in integers, with A≠1 (hence "hard"). I prefer the method of grouping: i.e., find two factors of AC that sum to B, use those factors to split the term Bx, and then factor the four terms by grouping. 
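As a sanity check on the grouping method, here's a small Python sketch of my own (using 5x<sup>2</sup>+7x−6 as the example quadratic); the point is that every intermediate expression stays genuinely equal to the original:

```python
# Factor 5x^2 + 7x - 6 by grouping, checking each step numerically.
f = lambda x: 5*x**2 + 7*x - 6

# Step 1: AC = 5 * (-6) = -30 = (10)(-3), and 10 + (-3) = 7 = B.
# Step 2: use those factors of AC to split the middle term Bx.
split = lambda x: 5*x**2 + 10*x - 3*x - 6
# Step 3: factor each pair of terms, then pull out the common binomial.
grouped = lambda x: 5*x*(x + 2) - 3*(x + 2)
factored = lambda x: (x + 2)*(5*x - 3)

# Every intermediate line is equal to the original expression.
assert all(f(t) == split(t) == grouped(t) == factored(t)
           for t in range(-10, 11))
```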
Pretty straightforward.<br /><br />But here's what a few students will insist on doing every semester: (1) Find factors of AC that sum to B; call these factors u & v (so that step is the same); (2) Write the expression (Ax+u)(Ax+v); (3) Look for a GCF of A in one of those binomials and strike it out.<br /><br />Here's an example: Factor 5x<sup>2</sup>+7x−6.<br />Step (1): Note AC = −30 = (10)(−3), factors which sum to B = 7.<br />Step (2): Write (5x+10)(5x−3)<br />Step (3): Divide the first binomial by 5, producing (x+2)(5x−3).<br /><br />So while this procedure does produce the right answer, what irks me tremendously is that the expression written in step (2) is not actually equal to either the original expression or the answer at the end. (Compounding this issue, students will nonetheless usually write equals signs by rote on either side of it.) Riffs on this procedure would be to write something like this on sequential lines, if you can follow it:<br /><br />5x<sup>2</sup>+7x−6 → x<sup>2</sup>+7x−30 → (5x+10/5)(5x−3/5) → (x+2)(5x−3)<br /><br />Again, the primary grief I have over this is that none of these expressions are equal to any of the others, and the students using this procedure are always oblivious to that fact. Second issue: They're likely to trip up over a non-elementary problem where the factor A does not appear in either of the binomials, e.g.: 4x<sup>2</sup>+4x+1 = (2x+1)(2x+1). Third issue: If there's a GCF in the quadratic itself and you overlook that, the standard grouping technique will still work (even if it's not the easiest way to do it), whereas I suspect users of this technique will be prone to incorrectly striking out any GCFs they discover at the end of the process. <br /><br />Now, technically you could modify this and turn it into a correct procedure this way: Note that for quadratic Ax<sup>2</sup>+Bx+C, values u & v satisfy uv=AC and u+v=B if and only if Ax<sup>2</sup>+Bx+C = 1/A(Ax+u)(Ax+v). 
(Proof: 1/A(Ax+u)(Ax+v) = 1/A(A<sup>2</sup>x<sup>2</sup> + (u+v)Ax + uv) = Ax<sup>2</sup> + (u+v)x + uv/A, and equate coefficients). So you could find u & v as usual, then write this latter expression, and simplify. The 1/A does always cancel out, but I've never seen a student actually write that factor in the second step.<br /><br />So what I always do if I see this on a test in my college algebra class is to take half credit off for the problem and note that the intermediary expression is "false", i.e., not equal to what comes before or after. This then becomes an opportunity to discuss with the student why that's improperly written math -- it went well in my most recent semester, but I can easily see that becoming more combative in a remedial algebra class.<br /><br />Have you seen this (common) faulty factoring procedure in your classes? What do you do as a correction for it, if anything?<br /><br /><br />http://www.angrymath.com/2014/03/faulty-factoring.htmlnoreply@blogger.com (Delta)4tag:blogger.com,1999:blog-7718462793516968883.post-2737796083155192294Wed, 05 Mar 2014 20:16:00 +00002014-03-05T15:16:35.243-05:00Presenting at Johns HopkinsHere's one of those topics that merges my great interests in teaching & gaming, so I have no choice but to cross-post about it here and on my gaming blog.<br /><br />Last week I had the opportunity to visit Johns Hopkins University, at the invitation of Peter Fröhlich to speak to his Video Game Design Project class in the computer science department there (run jointly with art students from the nearby MICA). It was a great talk and a chance to meet with his students and network a bit with Peter, Jason from MICA, as well as one of my idols from old-school role-playing game publishing. <br /><br /><a href="http://deltasdnd.blogspot.com/2014/03/presenting-at-johns-hopkins.html">Bounce on over to my gaming blog for the details! 
</a><br /><br />http://www.angrymath.com/2014/03/presenting-at-johns-hopkins.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-4799740675272678431Mon, 24 Feb 2014 10:00:00 +00002014-02-24T05:00:06.774-05:00Research in College Algebra Basic SkillsHere's something that I'm finding frustrating: for all the mountain of ink spilled on the issue of remedial math in colleges (including enormous numbers taking them, the fact that it's <i>the</i> critical determination of whether people get a college degree or not, dim prospects of existing placement tests, etc., etc., etc.), when I search for papers where someone has tried to correlate specific math skills of incoming students to success in college remedial algebra -- <i>I come up totally empty</i>. <br /><br />Weirdly, I can find studies that correlate specific diagnostic test questions in basic math skills to other classes. Here's one relating specific math skills to success in college<a href="http://www.amstat.org/publications/jse/v14n2/johnson.html"> statistics classes</a>. Here's <a href="http://www.amstat.org/publications/jse/v19n1/lunsford.pdf">another</a>. Here's a study relating basic math skills to success in <a href="http://econ.aplia.com/downloads/mathpaper011602.pdf">economics classes</a>. <br /><br />But predicting success in basic algebra classes? I'm coming up totally empty. I'm truly bewildered at this -- part of me can't possibly believe that no one has published results like that, but part of me is stewing from returning to this futile search many days over and over again.<br /><br />Does anyone know of such research linking specific skill questions to success in college remedial algebra? 
Or <i>any</i> college algebra classes?<br /><br /><br />http://www.angrymath.com/2014/02/research-in-college-algebra-basic-skills.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-8605872648395851517Mon, 17 Feb 2014 10:00:00 +00002014-02-17T05:00:08.078-05:00Precipitation ProbabilityThis winter module I've had a batch of students in my introductory statistics course who are so aggressively intelligent that they've caught every single place where I had any gray area or ambiguity in my lectures. In places I do this knowingly to simplify the subject, and prepare backup answers in case anyone asks -- this semester is the first time where every single one of my backups got used, and then some. This will definitely benefit my classes in the future, and in fact, I learned a few things myself along the way. For example:<br /><br />At the end of the probability concepts section, the major thing I want students to do is to interpret probability statements (which for some is the most difficult part of the course, never having encountered probability concepts before). I give a quiz question on the classic weather forecast precipitation probability: "Interpret this probability statement: 'There is a 40% chance of rain today in the New York area'". So personally, I always took this to mean that there was a 40% chance of getting any rain at all in New York today (40% chance for a drop of measurable rain somewhere in New York; i.e., over many days like today 40% of such days will get a drop of rain or more in New York). <br /><br />But one of my students not only started researching this on her own, <b>she actually called the New York weather service</b> to ask a meteorologist how this was computed. 
She still didn't get the interpretation quite right (one of the few questions she missed all semester), but the discussion was enlightening for both of us.<br /><br />The truth is that the weather forecast statement is in regard to rain at <i>any random location in New York</i>, not actually the rain for New York as a whole. I suppose that is really a more useful statement, after all. The publicized percentages are computed by multiplying the expected coverage area percent by the probability of rain occurring in that area (so a 40% chance that rain covers 100% of the area is reported the same as an 80% chance that rain covers 50% of the area: both come out to 40%). Therefore: What's being reported is the chance that any arbitrary point in New York gets measurable rain; i.e., 40% indicates that for any random point in New York, if we observe many days with conditions like today, 40% of such days get a measurable drop of rain on that point-location.<br /><br />Links to more information:<br /><ul><li>Comments from the National Weather Service, reposted at the University of Texas at Austin website: <a href="https://www.utexas.edu/depts/grg/kimmel/nwsforecasts.html">here</a>.</li><li>Video from a meteorologist in Boulder, including citation to the 2005 Gigerenzer et al. paper in <i>Risk Analysis</i> which surveyed people for their understanding of these statements (where I got my quiz question in the first place):<a href="http://blog.petermcgraw.org/2010/09/how-to-better-understand-your-weather-forecast-or-what-does-a-40-chance-of-rain-really-mean/"> here</a>.</li></ul>http://www.angrymath.com/2014/02/precipitation-probability.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-7904191170837223285Mon, 03 Feb 2014 10:00:00 +00002014-02-03T17:50:39.652-05:00Teacher Guilt and Grading WorkloadIs this a closely-kept secret? 
I think that all of the college math instructors I know really intimately (I'm counting about four, including myself) have at some point admitted to an overwhelming stack of grading assignments that they've procrastinated on, and a painful amount of guilt that they've experienced over that apparent failure. One instructor told me in passing once that he basically had a nervous breakdown over Thanksgiving break, over not being able to accomplish all the grading he needed to. Which was freakishly familiar, because the exact same thing basically happened to me, several years ago.<br /><br />I think this advice (like much of what I write here) goes out to new instructors, just starting their career, on the off chance they internet-search for this one day. So here we go: Your students absolutely deserve prompt feedback on their work, ASAP. But you also deserve a reasonable quality of life, not absolutely drowning in work and falling behind all the time. If you're not getting your grading done promptly, then you've got to be sensitive to that and <i><b>change your assignments such that they're gradable in reasonable time</b></i>.<br /><br />Note this goes regardless of whatever pedagogical fashion is currently happening, or whatever suggestions you've received for proper assignments to give. It <i>must</i> be doable in the time that you have, full stop. The top priority basically always has to be honesty about time management; if you can't do it, then you can't do it, and you need to admit that and change it.<br /><br />Here's how it started happening for me: When I was graduate student and given a few sections of college algebra to teach for the first time, my adviser (who was a really great guy) recommended what he did for homework -- have students keep a notebook journal of homework problems, turn in the notebook every week or two, and "check it off". 
Well, this turned into me lugging a giant box of notebooks home from school in the second week and it just sitting at home. Probably I attempted to "check it off" a few times and was so appalled and bewildered at the entirely undecipherable, jumbled scribblings in the first several books (possibly with answers from the back of the book transcribed at the end) that I just didn't see how I could possibly read, decipher, assess, and adjudicate this giant mass of outrageous nonsense (as I now interpret it). My mentor said it was "easy" for him, but it seemed utterly impossible for me.<br /><br />Moreover, what counted as acceptable work? If a student came and challenged me on not being "checked off", how could I defend that, or explicate what line needed to be crossed? All absolutely reasonable concerns. Well, then I was committing to assessing each problem in some fashion individually, scoring up a particular ratio of acceptable problems for a pass/fail check, providing rubrics, peering closely at every line of written transformations, etc. Or at least in theory I was: it was impossible, and every day I'd slump into class and mumble a shameful excuse about why the notebooks weren't coming back, and probably get around to it about twice a semester. A task that put dread over me literally all semester long.<br /><br />And you know -- this tradition more or less lasted for some years after I started teaching professionally for work. The process evolved by my first cutting down the number of problems to a fairly specific list that I expected to assess problem-by-problem. I went and ordered custom rubber pass/fail stamps to try and expedite the system; but an ever-present problem was even being able to find specific problems on some students' jumbled-up paper. 
Then I reduced it to about 3 specific problems I assigned each week on a one-page worksheet I designed, with dedicated space to put each problem, a completely worked-out example to show the correct format, and exactly 10 lines of work that I would assess line-by-line (passing would need 6/10 lines exactly correct). But even this requirement overwhelmed students in the basic algebra class, and there was constant combativeness around the assessment. Some could never learn to use an equals sign on each line. In most classes, one person would figure out the assignments, and everyone else would be copying their page when I walked into the classroom.<br /><br />There was some weekend a few years ago where it felt like I went basically berserk over the issue -- I just couldn't deal with it any more. I think this would be when I first switched from part-time to full-time, so my courseload doubled. Even the reduced one-page assignments were not manageable; I still had a sense of dread all the time, it didn't seem to help my students at all, and mostly all I got for my feedback was grief.<br /><br />End of the story -- those assignments simply had to freaking stop. The most honest truth that I finally realized was this: my remedial students need a monumental amount of work and practice to overcome their deficiencies, and I don't have anywhere near the amount of time in my life to assess all of that vast amount of work that they have to do. The responsibility has to be put on them -- even if the majority of remedial math students are going to fail at the challenge. <br /><br />The new protocol is this: There is a list of homework assignments that they're expected to do, all with answers to check at the back of the book, and if something doesn't work right or they have questions, then they can ask in class. I don't collect or grade this homework; there is too much for me to deal with. 
I'm usually needling just a bit at the start of every class; if no one asks about any exercises then there's some uncomfortable silence that I let settle. But usually I get one or two students who are asking questions, and then my time spent responding is actually helping someone who <i>does</i> want it, as well as the rest of the class, and also setting an example for proper study skills. As semesters pass, it seems like I get somewhat better traction and momentum with this, with more students actually participating. (I guess I write this tonight after multiple students in my statistics class asked about a bungled textbook problem that's been on my syllabus for 3 years now -- tonight was the first time anyone brought it up, slightly embarrassing for me, but otherwise beneficial to my future classes.) Also, I use our Blackboard system to deliver 5 multiple-choice quiz questions to the students every week -- entirely automated grading, so it pops up in my digital gradebook without any effort on my part, complete with comprehensive statistics on what the hardest parts were -- holding the students to some required attention each week, without spending any more of my home or class time on the process.<br /><br />As another example, this semester I switched my college algebra tests from multiple-choice to open-response (grading on quality of writing/justifications as well as raw answers). Right before I did this, I had another instructor warn me to not make it too burdensome on myself (a reasonable concern!). But I didn't just add work for myself: I <i>cut</i> the size of the tests to a level that would be easy for me to grade. Instead of 20-question multiple choice tests (like most instructors here use), I give 10-question open-response tests. Namely, the hardest 10 questions in the block -- no rinky-dink warm-up problems (like trivial linear equations or simply adding polynomials). 
But the scoring system is simple for each question: 1st point for the correct answer, and a 2nd point for well-written justification (or maybe 1 point for a single error, 0 points on the second error). Each point is 1/20 = 5% of the test, absolutely laser-fast to score and add up. (There's no fiddling with granularities less than 5%; I don't have time for that.) It takes me about an hour, maybe two, to grade and give feedback to all the problems for all the students in a section on one test cycle.<br /><br />So here's where I am today: When I give a test <i><b>I cannot wait to get started grading it</b></i>. I'm almost over-eager to see how my students are doing, and curious to see what's working well and what we can brush up and improve in the future. I know that the grading will go quickly and be productive, and I will be getting data about how the class is progressing very soon. Separately, I'm almost addicted to checking in on the online quiz progressions, following the statistics of which problems are hardest.<br /><br />The workload has flipped from dreadful to highly desirable, and I look forward to getting student work whenever I can now. I think I've had some test problems that were hard to grade (I can't think of what they were right now), but then I pull them out of rotation and replace them with something more reasonable to assess. I basically don't have arguments about grading anymore; the points are specified in advance on a practice test, and it's all very easy to see where everything is coming from.<br /><br />Last semester, I actually had one student express surprise and near-disbelief at how quickly I got graded tests back to students; namely, the very next day without fail. She asked me how I could do that when all of her other professors took at least a week to do the same thing. My answer was something like I was really curious about how my students were doing and couldn't wait to find out. 
(Truth be told, my tests are usually graded a few hours after I give them, and results are online usually around 2-3am after my night classes.)<br /><br />So I offer that as a success story of someone who's gone from crushing years-long ever-present guilt and dread over grading, to where I almost can't get started at it soon enough to satisfy me. The key is to budget time first, to be honest about what you can do, and to <i><b>cut and design assignments to a level where you can grade them with a sense of joy</b></i>. <br /><br />Have we all gone through this trip through the valley of dread? Have you?<br /><br /><br />http://www.angrymath.com/2014/02/teacher-guilt-and-grading-workload.htmlnoreply@blogger.com (Delta)8tag:blogger.com,1999:blog-7718462793516968883.post-2036456653769082315Mon, 27 Jan 2014 10:00:00 +00002014-01-27T11:40:41.125-05:00Excellent Exercises − Completing the SquareThis is the first of an occasional series that I'd like to post about intelligent exercise design for use in a math class (whether as part of a presentation, homework, or test). My primary point is that if someone just thinks that <i>they</i> can solve problems, and walks into a classroom and starts making up random problems to work on -- disaster is sure to strike. There are so many possible pitfalls and complications in problems, and such a limited time in class to build specific skills, that you really have to know <i>absolutely every detail</i> of how your exercises are going to work before you get in the classroom. Not expecting to do that is basically BS'ing the discipline. <br /><br />So in this series I'd like to show my work process and objectives for specific sets of exercises that I've designed for my in-class presentations. Are my final products "excellent"? Maybe yes or no, but certainly that's the end-goal. 
The critical observation is that a great deal of attention needs to be paid, and the precise details of every exercise have to be investigated before using them in class. And that some subject areas are surprisingly hard to design non-degenerate problems for.<br /><br />For this first post, I'll revisit my College Algebra class from last week, where I lectured on the method of "completing the square" (finding a quantity c to add to x^2+bx such that it factors to a binomial square, i.e.: x^2+bx+c = (x+m)^2... which of course is solved by adding c=(b/2)^2.). As per my usual rhythm, I had four exercises prepared: two for me and two for the students. Each pair had one that would be worked entirely with integers, and a second that required work with fractions. The first three went as expected, but the fourth one (worked on by the students) turned out much harder, such that only 3 students in the class were able to complete it (which sucks, because it failed to give the rest of the class confidence in the procedure). Why was that, and how can I fix it next time?<br /><br />First thing I did at home was turn to our textbook and work out every problem in the book to see the scope of how they all worked. Here I'm looking at Michael Sullivan's <i>College Algebra, 8th Edition</i> (Section 1.2):<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="http://3.bp.blogspot.com/-B7LhNnZ097E/UuQdxICAzaI/AAAAAAAACxg/ILleolywf1o/s1600/ExcellentExercises-CompletingSquare1.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://3.bp.blogspot.com/-B7LhNnZ097E/UuQdxICAzaI/AAAAAAAACxg/ILleolywf1o/s1600/ExcellentExercises-CompletingSquare1.gif" height="400" width="308" /></a></div><br />What we find here is that all of the problems in the book share a few key features. 
One is that after completing the square, when the square-root is applied to both sides of the equation, the right-side numerator never requires reduction (it's either a perfect square or it's prime). Second and perhaps more important is that the denominator is in every case a perfect square -- so the square-root is trivial, and we never need to deal with reducing or rationalizing the denominators. Third is that with one exception, in the last step the denominators of the added fraction are always the same and need no adjustment (the exception is in #43, where we adjust 1/2 = 2/4; noting that even when combining fractions on the complete-the-square step, I had a few students flat-out not understand how to do that). That does simplify things quite a bit.<br /><br />Now let's look at my fraction-inclusive exercises from class: <br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="http://2.bp.blogspot.com/--SEEQPX_uBA/UuQdxQBnJDI/AAAAAAAACxs/8zW_TXRtXT8/s1600/ExcellentExercises-CompletingSquare2.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://2.bp.blogspot.com/--SEEQPX_uBA/UuQdxQBnJDI/AAAAAAAACxs/8zW_TXRtXT8/s1600/ExcellentExercises-CompletingSquare2.gif" height="400" width="308" /></a></div><br />As you can see, item (b) (the one I did on the board) works out the same way, featuring a right-side fraction with a prime numerator and a perfect-square denominator. But item (d) (that the students worked on) doesn't work that way. The denominator of 18, after the square root, needs to be reduced, then rationalized, and that causes another multiplication of radicals on the top; and then to finish things off we need to create common denominators to combine the fractions. That extends the formerly 8-step problem to about 14 steps, depending on how you're counting things. 
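Incidentally, those regularities can be screened for mechanically. Here's a hypothetical Python sketch of my own (the function names and "clean" criteria are mine, distilled from the book features above): completing the square on ax<sup>2</sup>+bx+c = 0 gives a right-hand side of (b<sup>2</sup>−4ac)/(4a<sup>2</sup>), and an exercise stays clean when, in lowest terms, that numerator is prime or a perfect square and that denominator is a perfect square.

```python
from fractions import Fraction
from math import isqrt

def is_square(n):
    return n >= 0 and isqrt(n) ** 2 == n

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, isqrt(n) + 1))

def clean_exercise(a, b, c):
    """For ax^2 + bx + c = 0, completing the square yields
    (x + b/(2a))^2 = (b^2 - 4ac) / (4a^2).  'Clean' means the reduced
    right side has a prime or perfect-square numerator and a
    perfect-square denominator, so no radical reduction or
    rationalizing is ever needed."""
    rhs = Fraction(b * b - 4 * a * c, 4 * a * a)   # reduces automatically
    return (rhs >= 0
            and is_square(rhs.denominator)
            and (is_prime(rhs.numerator) or is_square(rhs.numerator)))

# The problematic in-class item (assuming its constant term was -9):
print(clean_exercise(6, 4, -9))   # False: the right side reduces to 29/18
```

Pre-screening candidate coefficient triples this way would flag such an exercise before it ever reached the students.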
<br /><br />You can see on the side of that work paper that I'm trying to figure out what parameters cause those problems to work out differently. One is that if there's any GCD between the first coefficient and any of the others, then some fraction will reduce and produce non-like denominators in the last step. And that in turn will result in a non-perfect-square denominator on the right after you complete the square (because of adding fractions with initially different denominators). So my primary problem in item (d) was using the coefficients 6, 4, and 9, which have GCDs between the first and each of the others.<br /><br />Finally, here's me trying to find a reasonable replacement exercise, which is harder than it first sounds (of course, trying to avoid all the combinations previously used in the book or classroom):<br /> <br /><div class="separator" style="clear: both; text-align: center;"><a href="http://1.bp.blogspot.com/-NHyU-rGH1MI/UuQhSQkxYuI/AAAAAAAACx4/nD7NVUPp8Z0/s1600/ExcellentExercises-CompletingSquare3.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://1.bp.blogspot.com/-NHyU-rGH1MI/UuQhSQkxYuI/AAAAAAAACx4/nD7NVUPp8Z0/s1600/ExcellentExercises-CompletingSquare3.gif" height="400" width="308" /></a></div><br />It took me 4 tries before I was satisfied. The first attempt had a GCD in the coefficients (and thus a denominator radical needing reduction/rationalizing), before I figured that part out. The second attempt fixed that, but accidentally had a reducible numerator radical, which makes it unlike all the stuff before that (√44 = √4*11 = 2√11). The third worked out okay, but I was unhappy with the abnormally large numerator radical of √149, which is a little hard to confirm that it's <i>not</i> reducible (the "100" and "49" kind of deceptively suggest that it is). 
So on the fourth attempt I cut the coefficients down some more, so the final radical is √129, which I'm more comfortable with.<br /><br />Now we could ask: shouldn't students be able to deal with those reducible and rationalizable denominators when they pop up? In theory, of course yes, but in this context I think it distracts from the primary subject of how completing-the-square works. More specifically, the primary (but not sole) reason we <i>want</i> completing the square is to use in the proof of the quadratic formula -- and coincidentally, neither of those complications appears if you work the proof out in detail (the numerator radical is irreducible, the denominator is a perfect square, and like denominators appear automatically). So as a first-time scaffolding exercise these are really the parts we need. If students were to encounter more complications in book homework on their own time, then that's great, too (although as we've seen in the Sullivan textbook, that doesn't happen).<br /><br />In summary: Completing the square exercises can get extremely bogged down with lots of radical and fraction work if you're not careful about how they're structured in the first place, losing the thread of the presentation when that happens. More generally: It may be necessary to work out <i>every</i> exercise in a textbook, as well as all your in-class presentations, beforehand in order to scope out expectations and challenge level. Hopefully more examples of this at a later date.<br /><br /><br />http://www.angrymath.com/2014/01/excellent-exercises-completing-square.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-8761610168594481853Mon, 20 Jan 2014 10:00:00 +00002014-01-20T05:00:07.926-05:00Show Work vs. Justify AnswersMy current testing protocol is that all of my remedial math classes have multiple-choice tests, but all of my college credit-bearing classes have open-response tests (i.e., <i>not</i> multiple-choice). 
This is a minor change this year, as previously I felt completely constrained by the various department-level final exams in our system, which are multiple-choice for most everything up through calculus (so as to make it easy for the department staff to score them). <br /><br />Anyway, for the in-class tests that I personally give, I recently grappled a bit with exactly what direction I should give in this regard. Of course many instructors use the phrase "Show your work", so much so that students frequently anticipate that as the direction. But does that address a real issue? Some people's work process is just undeniably crappy: scattered, jumbled, incoherent. While that might indeed be their work process, does it really do them or anyone else any good?<br /><br />What I've recently settled on is this direction: "Justify your answers with well-written math." This gets more to the heart of the matter, that one is using mathematical language to explain <i>why</i> something is so to <i>another person</i>. There's a specific syntax and grammar to this (just like French or Russian or anything): any arbitrary "this is the way I do things" doesn't cut it, because we need a shared language to be understood. And it prepares students to read a math book on their own. And to help other students in need, and be helped by them. And it allows the instructor to give useful feedback, by identifying a specific logical gap. And probably some other stuff that I'm overlooking right now.<br /><br />So at the level of College Algebra and above, I've started to grade half-credit on this basis as of this semester (for full credit, students need both the correct answer and properly-written math statements showing small-scale steps). Later in Trigonometry they can deal with more formal identity-proofs, etc., but I think this frames the expectations for students properly at an early point. <br /><br />Do you agree that this is a much better directive than "show your work"? 
Can you think of a better phrasing for the requirement? <br /><br /><br />http://www.angrymath.com/2014/01/show-work-vs-justify-answers.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-1428912393622125923Mon, 23 Dec 2013 18:29:00 +00002013-12-23T13:29:05.870-05:00Yes, And...This winter session I'll be teaching College Algebra, which I rarely do (once a year or less). Students are definitely sharper than in remedial algebra classes, which is a delight, but they're also more honed in on "playing the game" of grades for their own sake. That is to say: I get more incessant "will this be on the test?" cries than I do in other classes.<br /><br />One thing I'm doing new this semester is to give open-response tests (not multiple-choice), so that I have the option to grade on issues of correct writing format and the like. Or really anything else that comes up as an issue. (As a counter-balance, I'll be giving tests with fewer but more complex questions.) <br /><br />But in conjunction with that, I'm mentally prepping to try to answer those inquiries with a "Yes, but more importantly..." response. Like: Q: "Will our writing be graded on the test?" A: "Yes, but more importantly, that's how you communicate math to other people, and it's what you should be prepared to read in a math book on your own." Or Q: "Will graphs be on the test?" A: "Yes, but more importantly, it's the faster way to estimate or double-check any answer and avoid mistakes." So it gets the somewhat irritating question out of the way in the first word, and more importantly, it explains why that's really of secondary importance at best. Kind of like in improvisational comedy where you're supposed to respond to any creativity on your partner's part with "Yes, and..." ("and" being logically equivalent to "but", of course). 
<br /><br />Do you have any clever ways of dealing with cries of "Will this be on the test?"<br /><br /><br />http://www.angrymath.com/2013/12/yes-and.htmlnoreply@blogger.com (Delta)2tag:blogger.com,1999:blog-7718462793516968883.post-5693278169330580468Mon, 02 Dec 2013 10:00:00 +00002013-12-05T14:19:37.650-05:00Automatic DrillsI think we all know that certain skills need <a href="http://www.learninginfo.org/automaticity.htm">"automaticity"</a>, that is, such thorough learning and practice that they become automatic, unconscious, instantaneous. For example: Recognizing the letters of the alphabet, reading standard vocabulary words, times tables, negative numbers, etc. If you don't have those basic things working unconsciously, then you inevitably get distracted and make mistakes trying to attend to larger, more full-featured problems.<br /><br />But I've been thinking lately that the expectation and need for these most fundamental skills is often not communicated to our students; in an era that frowns on structure and drills for automatic knowledge, many of our students have never seen such a requirement assessed directly anywhere. Of course, I'm thinking of the times-tables drills that people my age did in the 2nd or 3rd grade, and nowadays may possibly be done in the 8th grade or high school by the more exceptional and dedicated teachers (so I hear). <br /><br />Might it be the case that in any class, there's at least one specific skill that is expected to become automatic, even if many of us overlook communicating and drilling on that? 
For example, it's occurred to me that we might expect the following regular speed drills to take place:<br /><ol><li>In early grammar school -- Times tables.</li><li>In late grammar school/college remedial arithmetic -- Negative number operations.</li><li>In junior high school/college remedial algebra -- Matching a slope-intercept equation to the graph of its line.</li><li>In statistics -- Estimating the area under part of a normal curve, or interpreting confidence intervals and P-values. (?)</li></ol>I don't know; with that last one, perhaps I'm reaching too much for a uniform rule throughout all my classes. But I am starting to consider a timed test for those automatic prerequisites on the first day of my classes, and repeated timed tests on the "new" automatic skill in each class. <br /><br />What do you think? Have you used timed drills to communicate the expectation of automatic skills? And for anything other than times tables?<br /><br /><br />http://www.angrymath.com/2013/12/automatic-drills.htmlnoreply@blogger.com (Delta)13tag:blogger.com,1999:blog-7718462793516968883.post-8819920826120134144Mon, 04 Nov 2013 10:00:00 +00002014-01-18T14:16:37.307-05:00Branching Decisions in AlgebraYesterday (as I write this) was a hard day for some students in my several remedial algebra classes. The lesson wasn't a long one (I was done lecturing about 40 minutes into the hour on the two topics), but about 1/3 to 1/2 of the class seemed to run into a brick wall in trying the final exercises on their own. 
The subject was basic factoring of polynomials, and after two days on the subject I had this combined procedure written on the board:<br /><blockquote class="tr_bq"><b>Factor completely process:</b><br /><b>(1) Factor GCF if possible.</b><br /><b>(2) Try DOS for binomial, or SQ for trinomial.</b></blockquote>All of those terms had been defined previously and quizzed verbally many times on prior days (GCF = greatest common factor, DOS = difference of two squares, SQ = simple quadratic, i.e., x^2+bx+c). Now obviously, anyone who had missed the prior day or been significantly late (so as to miss one or more of the 3 core procedures) would be at a disadvantage.<br /><br />But it appeared to me that the major roadblock was reading and implementing that direction in part (2): that is, following a logic branching procedure, making a decision on what to do next. For some of the more struggling students, I could stand by their desks and say something like: "Now you have two terms. That's a binomial. What should you try now?", and they either couldn't tell me or would pick the wrong procedure. (And then one student could only squint and squint at the board and clearly couldn't read what was written there -- presumably all semester, and in any other class they've taken.) <br /><br />And generally isn't that true for the hardest parts of the basic algebra class? Things like solving general equations (knowing what inverse to apply next for the problem, including exponents or radicals), identifying special products in a multiply exercise (FOIL, DOS, or square of a binomial), or even just following the written directions on any given problem (simplifying vs. solving vs. factoring vs. graphing, etc.). Maybe the weakest students are quasi-okay following directions for a few steps with straight-line flow of control (what do you call that?), but are unable to deal with any conditional branches or <i>decisions</i> along the way. 
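<br /><br />In programming terms, that part (2) on the board really is a literal branch. A throwaway Python sketch (the function and its messages are mine, purely to illustrate the decision structure):

```python
from functools import reduce
from math import gcd

def factor_advice(coeffs):
    """The board's 'factor completely' process as explicit branches.
    coeffs: the nonzero integer coefficients of the polynomial's terms."""
    steps = []
    g = reduce(gcd, (abs(c) for c in coeffs))
    if g > 1:                        # (1) factor out the GCF if possible
        steps.append("factor out GCF " + str(g))
    if len(coeffs) == 2:             # (2) binomial: try DOS
        steps.append("try DOS (difference of two squares)")
    elif len(coeffs) == 3:           # (2) trinomial: try SQ
        steps.append("try SQ (x^2 + bx + c)")
    return steps
```

The sticking point in class is exactly that if/elif: given 4x^2 - 16, a student has to notice "two terms" and take the DOS branch.<br /><br />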
(And of course this would be similar to the other great brick-wall of basic academics, namely computer programming.)<br /><br />I was talking to a colleague recently and said, "I really wish someone had taught our students basic logic at some point". And his response was, "Oh, logic is a very deep subject that is very difficult, I'm not sure how endless truth tables would help". (Oops, I didn't realize that he was a logician by research area.) But I responded: "All I'm looking for is that students can parse an And-Or-Not statement or an If-Else. Like if I write 'If the base is negative, then any odd power results in a negative', many students will make <i>all</i> odd powers negative at the end, by simply ignoring the first part of that statement." And he said, "Oh yes, I've been having the same problem in my classes lately..."<br /><br />Is this a key part of our problem for students attempting to enter college for the first time, at the level of either algebra or computer programming? That they simply can't make branching <i>decisions</i> when required? (Personally, one change I'll make in the future is to write my process as "If binomial try DOS..." so the decision is explicitly before the action, but I know from even a statistics course that I teach that many students still can't follow such a direction.) Is this intrinsic to the student, or is it evidence of high school academics that demanded mindlessness when following directions?<br /><br /><br />http://www.angrymath.com/2013/11/branching-decisions-in-algebra.htmlnoreply@blogger.com (Delta)6tag:blogger.com,1999:blog-7718462793516968883.post-3721162908281342761Mon, 28 Oct 2013 09:00:00 +00002013-10-28T05:00:13.409-04:00Keep Change ChangeHere's another one of these stupid memory devices that I guess some pre-algebra instructors use to get their students to hobble through their class, but then put them on the wrong path later on. 
It's a reminder specifically for how to subtract a negative number: +9-(-4) = +9+(+4) = 13, or -3-(-6) = -3+(+6) = 3, stuff like that. The "keep change change" mnemonic supposedly gets them to cancel the two juxtaposed negatives (and not the one in the first term).<br /><br />But like <a href="http://www.angrymath.com/2009/03/pemdas-terminate-with-extreme-prejudice.html">PEMDAS</a>, this sets up a terrible habit, and masks the real meaning of the writing. The actual story is that a negative functions like multiplication, and flows left-to-right the same as we read in English. Yes, students in algebra are routinely stumbling over negatives in general and the subtraction most of all. But when I try to clarify it, usually some student now goes "oh, it's keep-change-change". Then I ask them to simplify an expression with three or more terms in it, like +9-(-4)-(+3), and at that point they have no idea what to do. They don't see that juxtaposed negatives are cancelling out, just like a multiply. The mnemonic that got them through pre-algebra with only two terms at a time was a waste, and has set them up for failure later on. <br /><br />I've only heard this brought up by students in the last 4 years or so (not before that). Initially I suspected that the mnemonic was specific to where I teach, because the initials happen to be the same as our school. But when I do an online search it does show up in a small number of hits elsewhere -- well: actually just once at <a href="http://www.algebra-class.com/subtracting-integers.html">algebra-class.com</a> and then once as an answer to a Yahoo question (possibly those two items might be written by someone who went to our school?). <br /><br />So my question: Have you ever heard of this "keep change change" nonsense anywhere else? 
Did you ever hear it before, say 2008?<br /><br /><br />http://www.angrymath.com/2013/10/keep-change-change.htmlnoreply@blogger.com (Delta)1tag:blogger.com,1999:blog-7718462793516968883.post-5868698143216044938Mon, 21 Oct 2013 09:00:00 +00002013-10-21T05:00:10.244-04:00Are Parentheses Multiplication?Are parentheses multiplication? My remedial algebra students will pretty universally answer "yes" to this question; I guess they must be taught that explicitly in other courses. I'm pretty damned sure that the answer is "no", and I try to pound it out of them on the first day of the class.<br /><br />Even professional researchers exploring common mistakes in algebra education are prone to saying "yes" to this question, for example:<br /><blockquote class="tr_bq"><i>Misconceptions: Bracket Usage -- Beginning algebra students tend to be unaware that brackets can be used to symbolize the grouping of two terms (in an additive situation) and as a multiplicative operator </i>[Welder, "Using Common Student Misconceptions in Algebra to Improve Algebra Preparation", slide 7; references Linchevski, 1995;<a href="http://www.rachaelwelder.com/research/Elementary_Teachers_Mathematical_Knowledge_for_Teaching_Prerequisite_Algebra_Concepts.html"> link</a>]</blockquote>But are parentheses a multiplicative operator? It seems clear that the answer is "no". Now clearly all of the following are multiplications of <i>a</i> and <i>b</i>: <i>ab, (a)b, a(b), (a)(b)</i>, etc. But notice that the parentheses make no difference at all in this piece of writing. These are multiplications because of the usage of <b><i>juxtaposition</i></b>; any two symbols next to each other, barring some other operator, are connected by multiplication. 
Obviously, if there were some <i>other</i> written operator like + - / ^, between the <i>a</i> and <i>b</i> it would be something different; but granted that multiplying is probably the most common operation, we read the <i>absence</i> of a written operator to indicate multiplication. <br /><br />The chief problem with telling students that parentheses indicate multiplying is that they then routinely get the order-of-operations incorrect. Assuming a standard ordering ([1] inside parentheses, [2] exponents & radicals, [3] multiply & divide, [4] add & subtract), students want to perform multiplying with any factors in parentheses in the first step, before exponents. One of the first and frequently repeated side-questions I ask in my class is, in an exercise like "Evaluate 2+3(5)^2" -- "Yes or no, is there any work to do inside parentheses?" On the first day of algebra, almost the entire class will answer "yes" to this (and want to do a multiply), at which point I explain that the answer is actually "no". If there is no simplifying <i>inside the parentheses</i>, then the first piece of actual work will be to apply the exponent operation. And that's all that parentheses mean. (There is of course a multiplication here -- not because of the parentheses, but because of the juxtaposed 3, and it must take place after the exponent operator.) A majority of the class will pick up on this afterward, but not all -- some proportion of a class will continue to say "yes" and be confused by this particular question throughout the semester. (As another example, some students are prone to evaluate something like "(5)-2 = -10" for this and other reasons.)<br /><br /><br />Whether a factor is juxtaposed next to something in parentheses or not is irrelevant to the multiplication; parentheses are a separate and distinct issue. What say you? 
Have you ever said that the parentheses symbols actually mean multiplying?<br /><br /><br />http://www.angrymath.com/2013/10/are-parentheses-multiplication.htmlnoreply@blogger.com (Delta)2tag:blogger.com,1999:blog-7718462793516968883.post-7632672662661793176Wed, 16 Oct 2013 09:00:00 +00002013-10-16T05:00:09.243-04:00Quaternion Anniversary170 years ago today, Sir William Rowan Hamilton had the flash of insight on how to extend two-dimensional complex numbers to cover 3- and 4-dimensional space, in the form of <a href="http://en.wikipedia.org/wiki/Quaternion">quaternions</a> -- in particular the rather sticky problem of how to make their multiplication work reasonably. This occurred while he was walking with his wife along a canal, to an academy meeting in Dublin, Ireland. And to ensure that he didn't forget the insight, he famously took a knife and wrote the formula into the stone of Brougham Bridge as he walked underneath it. <br /><br />This summer I had the good fortune to visit Dublin, and my partner and I took an afternoon to make the hike and find the plaque commemorating Hamilton's discovery. (It's about a 3-hour round-trip walk outside the city, beside the utterly enchanting Royal Canal. Quicker if you have a car, of course, but we do everything on foot.) 
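<br /><br />The relation Hamilton scratched into the bridge -- i^2 = j^2 = k^2 = ijk = -1 -- pins down the entire multiplication table. A minimal Python sketch (the tuple representation and function name are my own choices):

```python
def qmul(p, q):
    """Hamilton's quaternion product for tuples (w, x, y, z) standing
    for w + x*i + y*j + z*k, expanded from i^2 = j^2 = k^2 = ijk = -1."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw*qw - px*qx - py*qy - pz*qz,   # real part
            pw*qx + px*qw + py*qz - pz*qy,   # i component
            pw*qy - px*qz + py*qw + pz*qx,   # j component
            pw*qz + px*qy - py*qx + pz*qw)   # k component

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
```

The sticky part is visible immediately: qmul(i, j) gives k, but qmul(j, i) gives -k, so the multiplication had to give up commutativity to work at all.<br /><br />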
Eureka!<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="http://2.bp.blogspot.com/-fR2zS3D4Ceo/UkdhnYw93_I/AAAAAAAACc0/Wdc1RS4WHds/s1600/DSCN5105.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://2.bp.blogspot.com/-fR2zS3D4Ceo/UkdhnYw93_I/AAAAAAAACc0/Wdc1RS4WHds/s320/DSCN5105.JPG" width="240" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="http://1.bp.blogspot.com/-pkYFDQLVAUc/Ukdhxc5GQwI/AAAAAAAACc8/qYO2azDlW5E/s1600/DSCN5103.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://1.bp.blogspot.com/-pkYFDQLVAUc/Ukdhxc5GQwI/AAAAAAAACc8/qYO2azDlW5E/s320/DSCN5103.JPG" width="240" /></a></div><br />http://www.angrymath.com/2013/10/quaternion-anniversary.htmlnoreply@blogger.com (Delta)2tag:blogger.com,1999:blog-7718462793516968883.post-4080678549526008837Mon, 07 Oct 2013 09:00:00 +00002013-10-07T05:00:01.372-04:00You Are Now Entering a Region With a Logarithmic Scale<div class="separator" style="clear: both; text-align: center;"><a href="http://4.bp.blogspot.com/-zEX0z-cpay0/UkdYxMbanlI/AAAAAAAACck/Af85GoGWATY/s1600/Image01121980015621.JPG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="240" src="http://4.bp.blogspot.com/-zEX0z-cpay0/UkdYxMbanlI/AAAAAAAACck/Af85GoGWATY/s320/Image01121980015621.JPG" width="320" /></a></div><br />Armagh Observatory Astropark, Northern Ireland, UK.<br /><br /><br />http://www.angrymath.com/2013/10/you-are-now-entering-region-with.htmlnoreply@blogger.com (Delta)0tag:blogger.com,1999:blog-7718462793516968883.post-2600617280332909975Mon, 30 Sep 2013 09:00:00 +00002013-09-30T05:00:02.232-04:00Remedial RecommendationsSo granted that the last blog post here was thinking about all the reasons why remedial college math classes in algebra are so tough (for students and teachers), I'm pleased to say that 3 weeks into this 
almost-all-algebra-remediation semester, things are definitely going the best for me in my decade-long teaching career. Here are some things that I'd say have had a clear, beneficial impact on my current semester:<br /><br /><ol><li><b>Shorter class times.</b> In the prior 8 years at CUNY, I have always had 2-hour long algebra classes, meeting twice per week (partly because I've mostly been part-time, teaching at night). For the first time, my classes are 1 hour long, meeting four times per week. This clearly works better for the endurance and attention available to the students. We're in, focused on one narrow topic, and finished before everyone gets too tired & cranky. This has been a pleasant and great surprise to me; definitely the biggest impact of the semester. (Not that it would work for night students or part-time teachers, where the travel burden would be inefficient.)</li><li><b>Starter exercise pack.</b> I expect students to have a copy of the textbook and be practicing exercises from it regularly, but very few do so (as noted last time). One problem is that students don't immediately have the textbook in the first week, as they're saving up, looking for a used copy, or having an old edition shipped online (as I explicitly encourage). This gap then sets the habit of them skipping my "practice" advice. What I did this semester is to copy a packet of "starter exercises" from the book, covering the first few weeks, <i>with answers</i>, so I can hand it out the very first day and explicitly point to what they can practice that very night. I've found this to be quite helpful in setting the precedent for regular practice; I've had more students than usual come to class with questions about problems, and this sets up a virtuous cycle of other students seeing it as expected behavior. </li><li><b>Tailored, trickier problems.</b> In the past my routine was to lecture, then turn to the book and practice problems from the text with students. 
Partly due to the relatively small number of problems in our in-house text, about a year ago I went through the course and wrote custom exercises for every in-class topic. Generally I wrote these to be tougher than standard starting problems, and every single problem from the first integrates common stumbling blocks (negative numbers, one and zero coefficients, etc.). Among the advantages here are that (a) we're not totally boring the students who have seen the material before, (b) we're always dealing with problems similar to test items, and (c) we're spending time "triaging" all the trouble spots. These exercises are working very, very well for me. Textbooks usually start problems sets with very rudimentary "common sense" examples to get started, but granted the limited class time we have available, I would highly recommend skipping those low-level problems and immediately start working with at least mid-level exercises for every topic.</li><li><b>Ending with flex-time.</b> There's probably a better name for this, but what I mean is: I end every class with a few exercises (one word problem or two pure algebra) and say, "This is the last thing we'll do today; show me the answers and you're free to go" (this being maybe 20-30 minutes before the end of the period). Then I circulate and check answers, give corrections or hints, etc. The better students push themselves to finish quickly and happily leave (thereby avoiding bored-irritated-distracted people in the room); the mid-level students get more time for feedback and cleaning up trouble areas (and also with less embarrassment or defensiveness from a roomful of people listening in); and the very weakest student gets some personal one-on-one time with me. I have to remember to give any homework or next-class directions prior to this point, of course. This was a great, semi-accidental find on my part. 
(And the flex-time mechanism works even better with 1-hour classes, since it happens twice as often as it would for my night classes.) </li><li><b>Surrendering on mobile devices.</b> My remedial students commonly come in with smartphones running and earbuds in both ears throughout the entire class. Considering that my higher-level students practically never do this, in the past I felt it was my responsibility to model proper collegiate discipline and be very hardcore about having people shut off their devices at all times. Frankly, the resistance to this could be so fierce that it blew up into security issues on me a few times. So as stupid as it seems, this semester I've been letting people sit in class using phones and with earbuds in without immediately confronting them (unless they were directly interacting with me at the time). It seems to take some of the pressure off, and in some cases for students who are legitimately already on top of the information, it may reduce the boredom-irritation factor. On the one hand, it's dumb as all hell, but on the other hand I don't really have the tools to fix that problem on top of everything else.</li><li><b>Entering with a sense of joy.</b> Not really new, but I try to remember to come into class with an upbeat attitude and thinking about how great it is to share the topic of the day with whomever's willing to listen. Obviously from the name of this blog you can tell that's not actually my most natural personality. But if I can, I try to shake as much crankiness off before stepping into the room. As the simply amazing film <a href="https://en.wikipedia.org/wiki/Monsieur_Lazhar"><i>Monsieur Lazhar</i></a> put it, "A classroom is no place for despair". 
That does seem to make things run more productively and with less general combativeness than some times in the past.</li></ol><br />Do you have any tactics and strategies that work particularly well in the context of remedial college classes?<br /><br /><br />http://www.angrymath.com/2013/09/remedial-recommendations.htmlnoreply@blogger.com (Delta)6tag:blogger.com,1999:blog-7718462793516968883.post-8663109715895079742Mon, 09 Sep 2013 09:00:00 +00002013-09-19T01:30:29.783-04:00Reasons Remedial is RoughToday is the start of my fall semester at CUNY, and my schedule is almost entirely teaching remedial algebra courses. (You know, the toughest course in the curriculum, that generally less than half of students anywhere pass.) So as I think about introducing myself to my students this week, and trying to earn their trust that what I'm asking them to do is truly necessary and worthwhile, one question that sometimes pops up is, "Why do so many students fail at remedial algebra?" <br /><br />The answer is that there's lots of reasons, and usually more than one for any given student. The philosopher Michel Foucault would call this state being "overdetermined" -- there's no single root cause we can ferret out that would fix everything. Without consulting hard data sources, here's a list of the top reasons that I see from my personal experience:<br /><ol><li><b>Lack of math skills from high school.</b> Many students simply don't have the requisite skills from high school, or really junior high school (algebra), or in many cases even <i>elementary </i>school (times tables, long division, estimations, converting decimals to percent, etc.). This deep level of deficit is like sand in the engine when trying to learn new math.</li><li><b>Lack of language skills from high school.</b> What's dawned on me in the last year or so, in the context of applied word problems, is that many students may actually be worse at English than they are at the basic math. 
Grammar isn't taught anymore, so students can't parse a sentence in detail, can't identify the noun or verb in a sentence, and so forth. This cripples learning the structure of any new language, algebra included.</li><li><b>Lack of logic skills from high school.</b> No one teaches basic logic, so students can't automatically parse If/Then, And, Or, Not statements, which form critical parts of our mathematical presentations and procedures.</li><li><b>Lack of study skills or discipline.</b> Almost none of my students do any of the expected homework from our textbook. (On the one hand, I don't collect or award points for homework, so you might say this is unsurprising; but my judgement is that the amount of practice students need greatly exceeds the amount of time I have to mark or assess it.)</li><li><b>Lack of time to study.</b> Certainly most of our community college students are holding jobs, or caring for children, or supporting parents or other family members. The financial aid system actually requires a full-time course load for benefits; combine that with a full-time job -- really, the equivalent of two 40-hour jobs at once -- and you get a very, very challenging situation. (Side note: In our lowest-level arithmetic classes, I find that work hours are positively correlated with success, but not so in algebra or other classes.)</li><li><b>Untreated learning disabilities. </b>This would include things like dyslexia, dyscalculia, ADD, etc. All I can do is speculate as to what proportion of remedial students would exhibit such problems if we instituted comprehensive screening. But I suspect that it's quite high. When students are routinely mixing or dropping written symbols, then disaster will result. 
Unlike other languages, concise math syntax has no redundancies to enable the "you know what I mean" safety net.</li><li><b>Emotional problems or contempt for the class.</b> I put this last, because it's probably the least common item in my list -- but common enough that it shows up in one or two students in any remedial classroom; and a single such student can irrevocably damage the learning environment for the whole class. Some students who actually know <i>some</i> algebra start the course thinking that it's beneath them, and become regularly combative over anything I ask them to do, sabotaging their own learning and that of others. It's pretty self-destructive, and the pass rate for these kinds of "know-it-all" students seems to be about 50/50. </li></ol>If you've taught similar courses, does that line up with your experiences? Have I left anything obvious out of the list?<br /><br /><br />http://www.angrymath.com/2013/09/reasons-remedial-is-rough.htmlnoreply@blogger.com (Delta)7tag:blogger.com,1999:blog-7718462793516968883.post-3543608861724836148Mon, 05 Aug 2013 09:00:00 +00002013-08-05T05:00:04.893-04:00Remedial Math at CUNY (NYTimes, 2011)Here's a clear-eyed and concise article from the <i>New York Times</i> back in 2011, "CUNY Adjusts Amid Tide of Remedial Students", regarding remedial math classes at CUNY (where I work), mostly focusing on LaGuardia Community College (a different school than my own). Similar information to stuff we know from elsewhere, but I didn't note it at the time, and I wanted to document it here. Some highlights: <br /><ul><li>Nationally, about 65% of incoming community college students need some form of remedial education (2:1 ratio of math to reading). 
</li><li>At CUNY, about 75% of students need some remediation.</li><li>In NYS, fewer than 50% of graduating high school students are ready for college or careers.</li><li>In NYC, the proportion of prepared high school graduates is only 23%.</li><li>At LaGuardia, 40% of all math classes taught are remedial. </li><li>Cost of remediation at CUNY doubled in the last 10 years to $33 million.</li><li>About 25% of CUNY community college freshmen graduate with a degree after 6 years. (Nationwide it's about 35%.)</li></ul><br /><div style="text-align: center;"><a href="http://www.nytimes.com/2011/03/04/nyregion/04remedial.html">http://www.nytimes.com/2011/03/04/nyregion/04remedial.html</a></div><br /><br />http://www.angrymath.com/2013/08/remedial-math-at-cuny-nytimes-2011.htmlnoreply@blogger.com (Delta)1tag:blogger.com,1999:blog-7718462793516968883.post-4211975094169589423Mon, 22 Jul 2013 09:00:00 +00002013-07-22T14:55:09.385-04:00San Jose State Suspends Udacity ExperimentNews this weekend that San Jose State in California has suspended its experiment with Udacity offering low-level courses for pay and college credit and requirements:<br /><br /><a href="http://www.latimes.com/news/local/la-me-0719-san-jose-online-20130719,0,4160941.story">http://www.latimes.com/news/local/la-me-0719-san-jose-online-20130719,0,4160941.story</a><br /><br />Key point: "Initial findings suggest that students in Udacity courses performed poorly compared with students in traditional classes." Note that this is broadly in line with the prediction I made here several weeks ago, in the post titled <a href="http://www.angrymath.com/2013/06/online-remedial-courses-considered.html">Online Remedial Courses Considered Harmful</a>, something that I considered to be a fairly easy and obvious call. 
I wrote, "We'll see how quickly MOOCs such as Udacity, and those partnering, paying, and linking their reputation with them, re-learn this lesson", and I'd have to say that this turnaround was faster than I would have guessed at San Jose State. Perhaps they will come to agree with the earlier experiment at the Philadelphia school, where it was concluded, "The failure rates were so high that it seemed almost unethical to offer the option" (see link to my earlier post above). <br /><br />The last paragraph of today's news story reiterates my own views, which I've written about here on numerous occasions: "Educators elsewhere have said the purely online courses aren't a good fit for remedial students who may lack the self-discipline, motivation and even technical savvy to pass the classes. They say these students may benefit from more one-on-one attention from instructors."<br /><br /><br />A few other points: "Preliminary results from a spring pilot project found student pass rates of 20% to 44% in remedial math, college-level algebra and elementary statistics courses." Now, it would be much better if this success rate were broken down individually for each of these several classes. I might guess that the 20% success rate is specifically for the remedial math course? That would be marginally lower than most remedial courses, where the pass rate seems to be around one-quarter or one-third.<br /><br />Also, the article says, "In a somewhat more promising outcome, 83% of students completed the classes." This seems unsurprising, given that students are paying $150 out-of-pocket for the course. This completion (but mostly failing) rate is about in line with the remedial courses that I teach, where students are similarly paying, meeting an absolute requirement by the college, and have no real academic penalty for failing (the course grade does not affect GPA, for example). 
<br /><br />Perhaps charitably we might say that the $150 expense level is lower than standard college teaching costs, and perhaps someone might think it's a reasonable return on investment, even granted a lower success rate (although maybe not when accounting for student time spent). And we might also be suspicious of (a) whether this is the actual Udacity expense, or if they're operating at a loss to establish the market, and (b) the quality of the assessment at the end, when there's a clear incentive to make it easy to pass and the Udacity statistics final I've seen in the past was <a href="http://www.angrymath.com/2012/09/udacity-statistics-101.html">almost comically trivial</a>. <br /><br />Supposedly this suspension is for re-tooling and analysis of possible improvements. "The courses will be offered again next spring, [San Jose State Provost Ellen Junn] said." We shall see.<br /><br /><br />http://www.angrymath.com/2013/07/san-jose-state-suspends-udacity.htmlnoreply@blogger.com (Delta)2tag:blogger.com,1999:blog-7718462793516968883.post-9080444144600775655Mon, 15 Jul 2013 09:00:00 +00002013-07-20T01:48:56.461-04:00Proof of Approximating Radicals to the Closest IntegerThe in-house textbook that my college uses for basic algebra classes does an interesting thing -- as part of the introduction to radicals, it goes through approximating a whole-number radical by comparing it to the nearest perfect squares. An example from the book:<br /><blockquote class="tr_bq"><i>Example 2: √3000 is closest to which integer?</i><br /><br /><i>Solution:... [after some preliminary estimates] Try between 50 and 60, (55)<sup>2</sup> = 3025, still a bit too high. Try (54)<sup>2</sup> = 2916, now a little too low. Thus √3000 is between 54 and 55, but closer to 55 since 3025 is closer to 3000 than is 2916.</i></blockquote>So I think we all agree that in a case like this, the radical is clearly <i>between</i> the two integers indicated (since the radical function is monotonic). 
But the additional step of saying which of the two it's closer to is not done in all textbooks. <a href="http://www.lemars.k12.ia.us/webfiles/mboyd/Pre-Algebra%20Textbook%20%28e-edition%29/Source/J9C09AAD.pdf">Here's another example</a> (not our school's textbook). Let's clearly state the claim being made here: <br /><blockquote class="tr_bq"><i>Claim: If x is closest to n<sup>2</sup>, then </i><i>√x is closest to n. </i></blockquote>Above, "closest" means minimizing the distance as n ranges over ℕ -- distance to the perfect square n<sup>2</sup> in the hypothesis, and to the integer n in the conclusion. This claim gave me a squirrelly feeling for some time, and with good reason; it <i>isn't</i> true for arbitrary x ∈ ℝ.<br /><blockquote class="tr_bq"><i>Counter-example: Consider x = 12.4. It's closest to the perfect square 3<sup>2</sup> (distance 3.4 from 9, versus 3.6 from 16). But the square root is actually closest to the integer 4 (</i><i>√12.4 ≈ 3.52). </i></blockquote>Now, let's characterize the kinds of numbers for which the claim in question <i>won't</i> work. For some integer n, take the cutoff between it and its successor, n+1/2 (i.e., the average of n and n+1). Any <i>√</i>x below this value is closer to n, while any <i>√</i>x above it is closer to n+1. Under the squaring operation, this cutoff gets mapped to the square-of-the-average (n+1/2)<sup>2</sup> = n<sup>2</sup>+n+1/4. <br /><br />On the other hand, consider the cutoff between the squares of the integers in question. Any x below their average is closer to n<sup>2</sup>, while any x above the average is closer to (n+1)<sup>2</sup>. This average-of-the-squares is (n<sup>2</sup>+(n+1)<sup>2</sup>)/2 = (n<sup>2</sup>+n<sup>2</sup>+2n+1)/2 = (2n<sup>2</sup>+2n+1)/2 = n<sup>2</sup>+n+1/2. <br /><br />So you can see that there's a gap between these two cutoffs, and in fact it's exactly 1/4 in all cases, no matter what the value of n. 
If you pick x in the range n<sup>2</sup>+n+1/4 < x < n<sup>2</sup>+n+1/2, then <i>√</i>x will be closer to its ceiling of n+1, but x itself will be closer to its floor-square of n<sup>2</sup>. Specifically, the problem cases for x are anything a bit more than the product of two consecutive integers (also called a <a href="http://en.wikipedia.org/wiki/Pronic_number">pronic</a> or <a href="http://oeis.org/A002378">oblong</a> number), exceeding n(n+1) = n<sup>2</sup>+n by a value of between 1/4 and 1/2. Since n<sup>2</sup>+n is itself an integer (ℕ closed under add/multiply), we see that any x in violation of the claim must be strictly between two consecutive integers, and thus cannot itself be in ℕ.<br /><br /><div class="separator" style="clear: both; text-align: center;"><a href="http://1.bp.blogspot.com/-HK-AA2XaPts/Ud-ZNDPLhqI/AAAAAAAACFw/kl3jNNIPSAU/s1600/ApproxRadicals.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="90" src="http://1.bp.blogspot.com/-HK-AA2XaPts/Ud-ZNDPLhqI/AAAAAAAACFw/kl3jNNIPSAU/s400/ApproxRadicals.gif" width="400" /></a></div><br />In conclusion: While the claim in question is <i>not</i> true for all real numbers, it is a trick that does happen to work for all whole-numbered values of x. How important is that? 
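For the skeptical, the whole-number case (and the counter-example) is easy to check computationally. Here's a quick Python sketch; the helper names are my own, not from any textbook:

```python
import math

def nearest_square_root(x):
    """The integer n whose perfect square n^2 is closest to x."""
    n = round(math.sqrt(x))  # candidate center; true answer is within 1 of this
    candidates = [c for c in (n - 1, n, n + 1) if c >= 0]
    return min(candidates, key=lambda c: abs(x - c * c))

def claim_holds(x):
    """Does 'closest perfect square' agree with rounding sqrt(x)?"""
    return nearest_square_root(x) == round(math.sqrt(x))

# The trick works for every whole number tested...
assert all(claim_holds(x) for x in range(1, 100001))

# ...but fails in the gap just above a pronic number, e.g. x = 12.4
# (since 3*4 + 1/4 < 12.4 < 3*4 + 1/2):
assert claim_holds(12.4) is False
```

Note that for integer x there is never a tie in the `min`, since a tie would require x = n<sup>2</sup>+n+1/2, which is not an integer.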
Personally, I'm pretty uncomfortable with giving our students an unverified procedure which can leave them thinking that it works for any number under a radical, when in fact that's not the case at all.<br /><br />http://www.angrymath.com/2013/07/proof-of-approximating-radicals-to.htmlnoreply@blogger.com (Delta)8tag:blogger.com,1999:blog-7718462793516968883.post-6022188045290460761Mon, 08 Jul 2013 09:00:00 +00002013-07-09T12:38:52.659-04:00Why Z-Scores Have Mean 0, Standard Deviation 1This article is aimed at introductory statistics students.<br /><br />Statistics, as I often say, is a "space age" branch of math -- many of the key procedures like Student's t-distribution weren't developed until the 20th century (and thus helped launch the revolution in science, technology, and medicine). While statistics are really critical to understanding modern society, it's somewhat unfortunate that they're built on a very high edifice of prior math work -- in the introductory stats class we're constantly "stealing" some ideas from calculus, trigonometry, measure theory, etc., without being totally explicit about it (the students having neither the time nor the background to understand them).<br /><br />One of the first areas where this pops up in my classes is the notion of z-scores: taking a data set and standardizing by means of z = (x-μ)/σ. The whole point of this, of course, is to convert the data set to a new one with mean zero and standard deviation (stdev) one -- but again, unfortunately, the majority of my students have neither knowledge of linear transformations nor of algebraic proofs to see why this is the case. Our textbook has a numerical example, but in the interest of time, my students just wind up taking this on faith (bolstered, I hope, by a single graphical check-in). 
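For reference, the fact itself is easy to confirm numerically. A minimal sketch in Python, using a small made-up data set and the population formulas:

```python
# Check that z-scores z = (x - mu) / sigma have mean 0 and
# (population) standard deviation 1, for an arbitrary data set.
data = [4, 7, 7, 9, 13]  # any data set with sigma > 0 works

n = len(data)
mu = sum(data) / n                                      # population mean
sigma = (sum((x - mu) ** 2 for x in data) / n) ** 0.5   # population stdev

z = [(x - mu) / sigma for x in data]                    # standardize

z_mean = sum(z) / n
z_sigma = (sum((v - z_mean) ** 2 for v in z) / n) ** 0.5

assert abs(z_mean) < 1e-12        # mean of z-scores is 0
assert abs(z_sigma - 1) < 1e-12   # stdev of z-scores is 1
```

The mean comes out to 0 because the deviations (x-μ) always sum to zero, and dividing each by the positive constant σ rescales the spread to exactly 1.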
<br /><br />Well, for the first time in almost a decade of teaching this class at my current college, I had a student come into my office this week and express discomfort with the fact that he didn't fully understand <i>why</i> that was the case, and if we'd really properly established that fact. Of course, I'd say this is the very best question that a student could ask at this juncture, and really gets at the heart of confirmation that should be central to any math class. (Interesting tidbit -- the student in question is a History major, not part of any STEM or medical/biology program required to take the class.)<br /><br />So I hunted around online for a couple minutes for an explanation, but I couldn't find anything really pitched at the expected level of my students (requirements: a <i>fully</i> worked out numerical example, graphical illustration without having heard of shift/stretches before, algebraic proof without first knowing that summations distribute across terms, etc.) Instead, I took some time the next day and wrote up an article myself to send to the student, which you can see linked below. Hopefully this very careful and detailed treatment helps in some other cases when the question pops up again:<br /><br /><div style="text-align: center;"><b><span style="font-size: large;"><a href="http://www.superdan.net/download/blog/angrymath/MeanAndStdevOfZ-Scores.pdf">MeanAndStdevOfZ-Scores.pdf (106 KB)</a></span></b></div><br /><br />http://www.angrymath.com/2013/07/why-z-scores-have-mean-0-standard.htmlnoreply@blogger.com (Delta)4tag:blogger.com,1999:blog-7718462793516968883.post-1884116567266148051Mon, 01 Jul 2013 09:00:00 +00002013-07-01T05:00:05.229-04:00Institutionalized Score ManglingFor some reason, there's been a bunch of stories of schools secretly boosting near-failing grades recently. 
A few that come to mind:<br /><br /><ol><li>Just this weekend -- <a href="http://www.newsday.com/long-island/towns/hempstead-schools-change-failing-grades-1.5591109">Hempstead High School on Long Island</a> (somewhat near me) has a scandal of regularly boosting failing scores of 63 and 64 to passing 65s in any class from grades 6-12. Apparently this has been done for some number of decades, and the Deputy Superintendent defends it as customary at their school and others (although it was done in secret and not any documented policy). Other schools nearby deny that they engage in the same practice.</li><li>Early last month, an <a href="http://hackaday.com/2013/06/05/hacking-high-school-exams-and-foiling-them-with-statistics/">Indian student attending Cornell University</a> accessed and mined the data from last year's Indian national high school exams, and found that the scores being reported were very clearly manipulated in some secret way, as there were irregular gaps in the achieved scores across all subject areas. In particular, none of the scores 32, 33, or 34 were achieved by any student for any subject in the entire country, whereas 35 is the minimum to pass.</li><li>Less publicized (but perhaps more dramatic) is the fact that <a href="http://www.forbes.com/sites/jamesmarshallcrotty/2013/05/08/are-new-york-city-students-getting-smarter-or-are-regents-exams-getting-easier/">New York State Regents Examinations</a> are in some sense getting easier, as the high school system brags about increased graduation rates even as the proportion of its graduates needing remedial instruction in college reaches around 80%. Someone who really ought to know told me that the scores on the exams are effectively mangled by administrators in Albany, i.e., a 45% raw performance is reported as a passing scaled score of "70" and so forth.</li></ol><br />All of this certainly seems really bad to me in a first-pass "smell test" of credibility. 
It just seems like any kind of secret score-mangling is a foul wind that carries with it lack of transparency, disbelief in results, corruption, etc. Interestingly, a great many commentators at <a href="http://yro.slashdot.org/story/13/06/06/1338217/hacker-exposes-evidence-of-widespread-grade-tampering-in-india">Slashdot</a> (around the Indian story) said things like "this is done everywhere, if you don't understand it then you don't know anything about teaching", which is false in my experience. But apparently the motivation is frequently to avoid conflict and time spent around complaints over barely-failing scores. Some other institutional strategies I've seen or heard about to deal with this issue:<br /><ul><li>Those who miss passing by 5% get to immediately take a re-test. I haven't seen this, but I've heard it said of other universities.</li><li>Those who miss passing by 5% get a one-week refresher seminar, and can then re-test on the final. A somewhat more subtle version of the preceding which is used where I teach at CUNY for math remediation.</li><li>Keeping both scores and the passing criteria itself secret -- reporting only pass-or-fail results for the test. This was done in the past at my college, allegedly to forestall complaints over scores. It's pretty much my least favorite option, because it just made everyone involved confused and upset over the secret criteria and unknown scores.</li></ul><br /><br />Now, I'm always in favor of maximal transparency, honesty, and confidence in any kind of process like this. But in some cases I've found myself to be a lone voice for this principle. Is this kind of secret score-mangling an acceptable social massaging of high-stakes testing, or is it the harbinger of corruption and non-confidence in our institutions? Do we even have any choice in the matter anymore, as educators or citizens?<br /><br />http://www.angrymath.com/2013/07/institutionalized-score-mangling.htmlnoreply@blogger.com (Delta)8