The dip in the pass rate was clearly caused by extra ‘promoted’ pupils and would have been even greater had it not been for manipulation of the marks, writes Nic Spaull
Like so many things in South Africa, the matric results released this week are a paradox of good policies badly implemented. This time it was the minister's bold "promotion policy" that led to 21% more pupils writing matric (644,536 in 2015 compared with 532,860 the year before). The policy limits the number of times pupils can repeat a grade to once every three years, and means that fewer drop out, being "promoted" instead.
While her decisive action has led to increased efficiency and access, it has inadvertently caused a huge crack in the matric standardisation process, one only now becoming apparent. That the Department of Basic Education did not properly identify all progressed pupils, and that exams monitoring body Umalusi did not (perhaps could not) take account of all such pupils in its standardisation process, calls into question the massive upward adjustments in marks that took place behind the scenes.
As usual, some commentators myopically focused on the drop in the pass rate, from 76% (2014) to 71% (2015), as if this were a meaningful indication of anything. It isn't. Nor does it signal a decline in quality, or harder exams. It doesn't. Yes, the pass rate went down, but the number of pupils passing went up. The real question might not be why the pass rate dropped, but why it didn't drop further.
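The arithmetic behind this apparent paradox is easy to check. A quick sketch, using the cohort sizes and pass rates quoted above (the pass counts are therefore rounded approximations, not official figures):

```python
# Approximate pass counts implied by the cohort sizes and pass rates above.
passes_2014 = 0.76 * 532_860   # roughly 404,974 passes
passes_2015 = 0.71 * 644_536   # roughly 457,621 passes

# The pass *rate* fell by five percentage points...
assert 0.71 < 0.76
# ...yet the number of passes still rose, by roughly 52,000.
extra_passes = passes_2015 - passes_2014
print(round(extra_passes))  # 52647
```

A lower rate on a much larger cohort can still mean more passes, which is exactly the pattern the article describes.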
Comparing Umalusi's media statement with the department's report makes the answer clear. The decision was made to raise raw marks across the board, from maths and physical science to life sciences, maths literacy, history, accounting, geography and 24 other subjects. Umalusi emphasises that this was an "unprecedented set of adjustments". When the maths literacy pass rate is adjusted from 38% to the (publicly reported) 71%, this is certainly unprecedented and, I would argue, unwarranted.
Was the test really so much more difficult than previous years? (This is the only reason why one is allowed to adjust the marks at all.) Why did the internal and external moderators not pick up the huge increase in difficulty? Is it not more plausible that the massive drop in pre-adjusted performance was actually due to the additional 112,000 weaker pupils who would have otherwise dropped out? If so, Umalusi shouldn't have adjusted.
This is not to say that Basic Education Minister Angie Motshekga was wrong in introducing the promotion policy. Quite the opposite; she was heeding local and international research, which shows that excessive repetition is costly, inefficient and has no educational benefit to the pupil.
Yes, we do need to find ways of preventing and remediating the learning deficits that lead to repetition, but rooting out wasteful repetition in the meantime is prudent and wise. A positive effect of this policy and the extra-large class of 2015 was that many more pupils took and passed key subjects, with about 52,000 extra matric passes, 9,000 extra maths passes and 15,500 extra bachelor passes.
Umalusi and the department claim there were only 65,671 progressed pupils. Yet there were an extra 111,676 matrics this year. So where did the other 46,005 extra pupils come from? The clear answer is that a big policy change prevented schools from failing pupils multiple times and encouraged them to promote weak pupils into matric.
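The gap is simple subtraction, using only the enrolment figures already quoted in the article:

```python
# Matric cohort sizes cited above.
extra_matrics = 644_536 - 532_860       # 111,676 more candidates in 2015
officially_progressed = 65_671          # the figure Umalusi and the department claim
unexplained = extra_matrics - officially_progressed

print(extra_matrics)   # 111676
print(unexplained)     # 46005 candidates not accounted for as "progressed"
```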
Second, the way provinces record and report who counts as a progressed pupil is highly dubious and varies by province and district. So, although we have about 66,000 officially progressed pupils, we also have some 46,000 quasi-progressed pupils (what Umalusi calls "borderline candidates").
All of this matters because it influences the decision on whether to adjust the matric results, and by how much. Umalusi is only ever meant to adjust marks up or down if it believes the exam was harder or easier than in previous years. The core assumption is that the different matric cohorts (2013, 2014 or 2015) are of equal ability; thus, any differences between the years can only be because the paper was easier or harder. And this is where the crack emerges. There is simply no way the 2015 distribution of 645,000 matrics (including progressed and quasi-progressed pupils) is as strong as the distribution of 533,000 pupils in 2014.
Thus the reason the 2015 cohort did so much worse on the raw scores was the extra 112,000 weaker pupils, not harder tests. We know that Umalusi did not take this into account, because there is no way of identifying the 46,000 quasi-progressed pupils. In Umalusi's defence, it could not have excluded them even if it had wanted to, because provinces did not record them. But it does not seem that Umalusi excluded these 112,000 (or even the 66,000) pupils when it standardised the 2014 and 2015 distributions. This is illogical.
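To see why adding weaker candidates alone can drag raw marks down, consider a stylised example. The mark levels below are hypothetical and purely illustrative; only the rough cohort proportions (533 to 112, in thousands) come from the article:

```python
# Hypothetical illustration: the same exam, written by a larger, weaker cohort.
regular = [55] * 533      # stylised 2014-type candidates averaging 55% (assumed)
progressed = [30] * 112   # stylised extra, weaker candidates averaging 30% (assumed)

mean_2014 = sum(regular) / len(regular)
mean_2015 = sum(regular + progressed) / (len(regular) + len(progressed))

print(mean_2014)            # 55.0
print(round(mean_2015, 1))  # 50.7 -- the average falls with no change in paper difficulty
```

The drop in the combined average is driven entirely by the composition of the cohort, which is the article's point: a weaker intake, not a harder paper, is the natural first explanation for lower raw marks.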
In an unusual change from previous media statements, this year Umalusi included the raw failure rates of subjects (before any adjustments). These can be compared to the marks in the technical report issued by the department. The only difference between the two figures is the Umalusi adjustment, a small change due to school-based assessments and a small compensation for second-language pupils (an extra four percentage points).
When I refer to "raw" and "final-adjusted" pass rates I mean before and after these are accounted for. The three subjects I will focus on are maths literacy, geography and business studies, since all have big increases in enrolment, which suggests these were the subjects taken by the progressed and quasi-progressed pupils. The differences between the raw pass rate and the final-adjusted pass rate are large for geography (increased from 66% to 77%), for business studies (from 54% to 76%) and especially for maths literacy (from a shockingly low 38% to 71%). For a national assessment these are incredibly large adjustments.
This could be justified only if the 2015 exams were extraordinarily more difficult than in 2014. I simply do not buy it. The internal and external moderators all agreed that these exams were set at the appropriate level. To warrant adjustments of this magnitude, they would have had to have been way out in their judgment. Why are we looking for alternative explanations for the big drop in raw marks when this one is staring us in the face? The most logical and obvious reason for the drop is the inclusion of an extra 112,000 weaker pupils.
Paper difficulty is marginal by comparison. In maths literacy alone there were 76,091 extra candidates in 2015. Where did they come from? It is clear that these are the weaker progressed and borderline candidates, and that they are the main reason why the raw marks dropped so much. If so, we cannot just adjust the raw marks upwards, as was done last year.
The standardisation process is necessary, and probably the best we can do when different papers are written year on year, but Umalusi needs to clarify what happened here and in future be more transparent in its standardisation. Unfortunately, no amount of standardisation can solve the biggest problem in our education system: that most children in the poorest 75% of schools have not learnt to read for meaning by the end of Grade 3 and are forever behind. Indeed, matric starts in Grade 1.