Friday, May 16, 2014

How valid is the phonics screening check?

The Journal of Research in Reading has just published an important and timely paper on the government’s phonics screening check, ‘Validity and sensitivity of the phonics screening check: implications for practice’ (Duff, F.J., Mengoni, S.E., Bailey, A.M. and Snowling, M.J.).
It asks two ‘critical’ questions: first, how well do scores on the screening check ‘correlate with reading skills measured by objective tests’? And, second, ‘is the check sensitive?’, that is, how sensitive is the check in detecting children ‘showing early signs of being at-risk of encountering a reading difficulty?’
The study involved eight primary schools in York and included 292 children. Aside from the screening check, an array of other tests was administered. These included school-based assessments, class spelling tests, individualised word reading, comprehension, nonword reading and phonological awareness tests.
So, how valid is the check? Does it measure what it claims to measure? The authors conclude that the check is ‘a highly valid measure of children’s phonics skills’. Moreover, the check ‘showed convergent validity by correlating strongly with other measures of phonics skills and with broader measures of reading’. The latter include ‘single-word reading accuracy, prose reading accuracy and comprehension’. This should provide strong justification for the government’s introduction of the check, though, as Hans Eysenck once remarked, ‘[i]deological thinking is not easily swayed by factual evidence’*.
The authors also concur with previous studies that ‘a rigorous assessment of phonics skill is important for early identification of children at risk of reading difficulties’, which they define here as ‘not yet attaining the phonics phase expected by the end of Year 1 (i.e., not attaining phase 5)’.
An unexpected boon of the study was the finding of ‘a slight tendency to overestimate the prevalence of at-risk readers (as compared with standardised tests of reading accuracy and fluency)’, which the authors, in my view rightly, contend to be ‘a favourable property for a screening instrument’.
Where the authors are more equivocal is around the issue of whether the check is necessary. Although they conclude that it is valid, they also suggest that, where teachers are well trained ‘in the teaching and assessment of phonics, their judgements are sufficient for the purpose’. They go further and add that ‘the use of resources to better equip teachers to conduct ongoing phonic assessments would be more cost-effective, not least because this would place them in the best position to intervene before reading difficulties set in’.
Although it is hard to disagree with the proposition that there would be no need for a screening check if teachers were sufficiently well trained to monitor and assess children’s phonics knowledge and understanding, the authors seem to be ignoring a number of important findings. As Jeanne Chall revealed in her seminal book Learning to Read: The Great Debate (1967), teachers have a strong tendency to be eclectic. They find it very difficult to abandon previous approaches, many elements of which can often be seen to run counter to the principles of a new programme. Neither do they easily relinquish their old pedagogical ideologies unless given training that provides a clear rationale for what it is they do and the way in which they do it.
More recently, the NFER report ‘Phonics screening check evaluation: Research report’ (May 2014) bore this out. ‘Even amongst those who are strongly supportive of phonics,’ it reported, ‘there was a firm conviction that other strategies were of equal value and that phonics as a method of teaching reading was most successful when used in conjunction with other techniques’.
As was made clear by a number of respondents to the survey carried out by NFER, there is still a huge amount of confusion, particularly around the areas of decoding and comprehending, in the minds of many teachers. Indeed, one percipient teacher remarked, ‘I think the moment you start to use other methods, you aren’t actually doing synthetic phonics’.
Another rather conspicuous omission from the paper was any comment on the match-funding programme initiated by the government and the appalling disclosure that over 90% of allocated funding had been spent by schools on resources (meaning mainly books) and not on training.
Two things: why is it that so few research articles on the teaching of reading and spelling ever quote from or comment on the work of Diane McGuinness? And why is there such a disconnect between the practitioners in the field training teachers (nearly 12,000 alone in the case of Sounds-Write) and academic researchers in our universities? In case you’re listening out there, there is much better material available than the insipid and meagre diet doled out to many children in the form of Letters and Sounds.

* Quoted from Robert Peal's Progressively Worse: The burden of bad ideas in British schools, p.62.


Dick Schutz said...

While the correlations of the Screening Check with "other tests" are of some interest, the report is a weak investigation of the validity of the measure, and the "implications for practice" are based on an invalid hypothetical: "where teachers are well trained ‘in the teaching and assessment of phonics, their judgements are sufficient for the purpose’".

A much sounder study of the validity and reliability of the screening check is the DfE Technical Report published 1 May 2014:

The striking finding of this report is in the comparison of item statistics for Yr 1 students and Yr 2 students who failed the test in Yr 1. (What is termed "facility" is the percentage of students who were able to read the item.) The Yr 2 stats are barely distinguishable from the Yr 1 stats, and the overall Yr 2 distribution is "worse" than the Yr 1 distribution.

It's in the instruction, not in the Check. Teachers who say that they "already knew" what the Check showed are unlikely to do anything different than they were "already doing." And the general consensus that the "phonics" instruction should be "embedded in a rich curriculum. . . " provides statutory and professional justification for their action.

The thing is, all reports to date provide little or no information to identify the "practical implication" of which schools and teachers are "doing it right", that is, what instruction the kids are actually receiving.

We do know the limitations of "Letters and Sounds" and the government incentives for materials and training, and we know a good deal about "opinions," but that's not much to go on.

The data for identifying the teachers and schools who are "doing it right" are available. But that's a whole nother story.

Dick Schutz said...

It's sadly ironic that the "research" to date on the Screening Check is less sound than the measure.

I just stumbled across another "study," Screening Phonics in England: A Cause for Concern?, reported in the Open Education Research Journal in December 2013.

The paper presents useful graphs that disaggregate the 2012 national data for "Free School Lunch" students and for three groups of "Special Needs" students. However, given the views the author brings to the graphs, the conclusion drawn is this: "Notwithstanding teething troubles of the first national administration, it is clear that the guidance on the use of this test does not adequately counter the pressure to pass children. This leaves teachers in an unfortunate situation but as it is, it tells us that such early testing is not well integrated into teaching practice and is unlikely to be very accurate."

It's in the "accuracy" of the "teaching practice", not in the "testing."

John said...

Hi Dick,
Sorry to take so long to get back - the problem of looking after aged P, I'm afraid!
Thank you again for your comments and, as always, you get straight to the heart of it! We do indeed need to know what it is that successful schools are doing. There are out there a small number of very good quality phonics programmes. The problem for teachers is knowing which ones, since no-one seems to be willing to spend the time and money on doing the research. Shame on the government for that!
It is enormously irritating that many well-known names in researching phonics teaching are still stuck with materials like Letters and Sounds or some old fashioned traditional phonics programmes when the quality of phonics teaching has improved so much in recent years.
We have a school - St George's Primary School in Wandsworth - half of whose intake is on free school meals, yet they scored 100% on the phonics screening check. Looking at the SATs scores at the end of Key Stage 1 (the end of the first three years of school) it seems to me that there is a pretty close correlation between pupils' reading and spelling scores on standardised tests and their English SATs scores but I don't see anyone looking at this.
Moreover, and rather unfortunately, when we train teachers, there's no guarantee they won't revert to the status quo, or at least continue teaching material that runs counter to the principles of good quality phonics teaching. But, as you say, that too is a whole nother story!

Debbie Hepplewhite said...

Hi John and Dick,

I know that John has seen my 'Simple View of Phonics Provision' but I don't know whether Dick has or not.

Anyway, I, too, am concerned about schools selecting 'Letters and Sounds' as their core programme, and so I drew up this graphic based both on predictions from some years ago and on observations in real schools and video footage viewed via the internet:

This might be of interest. We certainly know from the NFER survey of teachers' descriptions of their phonics practice, and of their views of the Year One Phonics Screening Check, that most schools claim to be 'Letters and Sounds' schools and that many teachers may well be continuing with the flawed and damaging multi-cueing reading strategies.