
Junior School Collaboration

Sharing ideas, challenges, solutions

Latest News

Keep up to date with all the latest news.

  • Reading and the English Curriculum webinar, 8th December 2021, with Sarah Hubbard HMI (OFSTED National Lead for English), Professor Teresa Cremin and Claire Williams (Headteacher, Bournville Village Primary School)

    Thu 04 Nov 2021 Christopher McDonald

    We are pleased to announce a free webinar on the 8th December 2021. The focus of this event will be Reading and the English Curriculum, and it will feature presentations from Sarah Hubbard, HMI and OFSTED National Lead for English; Professor Teresa Cremin, who leads the Open University Reading for Pleasure project; and Claire Williams, Headteacher at Bournville Village Primary School.

    The agenda for the event (with working titles) is:

    4pm: Reading and the English Curriculum: Sarah Hubbard HMI

    4:40pm: Reading for Pleasure: Professor Teresa Cremin

    5:10pm: Reading for Pleasure at Bournville Village Primary School

     

    You can register for this event by clicking below: 

    https://us06web.zoom.us/webinar/register/5116361011388/WN_3YyOsxrgSD2BjT-MNub30g

     

  • Headteacher Wellbeing Virtual Conference

    Mon 25 Oct 2021 Christopher McDonald

    Join us on the 5th November for our first Headteacher Wellbeing Virtual Conference. The agenda for the day is:

    8:30am: Sign-in opens
    9:10am: Welcome - Chris McDonald
    9:15am: Compassionate Leadership + Q and A - Dame Alison Peacock
    9:55am: Reducing Observational Bias + Q and A - Ross Morrison McGill (TeacherToolkit)
    10:35am: Comfort break
    10:45am: "I'm very polite to you, you're very polite to me." + Q and A - Barry Smith
    11:30am: Creativity in Reading and Writing + Q and A - Pie Corbett
    12:15pm: Lunch break and end of session
    12:45pm: OFSTED update - Claire Jones HMI, Specialist Advisor: Policy, Quality and Training (Schools and Early Education)
    1:30pm: "Reading for pleasure and the reading community" + Q and A - Professor Teresa Cremin
    2:15pm: If this was your last year in teaching… + Q and A - Adrian Bethune

     

    You can register at: 

    https://us06web.zoom.us/webinar/register/6416344903326/WN_0pMFtV9sTsW3KdPqzV40qA

  • Curriculum and assessment project with Dr Mick Walker

    Fri 08 Oct 2021 Christopher McDonald

    We are pleased to announce that a number of schools from JUSCO, as well as schools that are not part of JUSCO, are working with Dr Mick Walker on a Curriculum and Assessment project.

    The project aims to identify end-of-year / end-of-key-stage "end points" / "batons" / "key performance indicators" for each curriculum subject.

    Two training sessions have been arranged with Mick, and these will take place by Zoom. If you are interested in being part of the project, please complete the registration form below; there is no charge for being part of this project.

    Registration link:

  • Curriculum and Assessment Think Piece with Tim Oates CBE and Mick Walker: "The art of questioning"

    Mon 24 Jun 2019 Chris McDonald

    We are happy to share our latest Curriculum and Assessment Think Piece with Tim Oates CBE and Mick Walker. In this think piece Tim and Mick consider the art of questioning. 

     

    Go to Think Piece

  • OFSTED Inspector training Autumn 2018 - Assessment reflections

    Wed 21 Nov 2018 Chris McDonald

    JUSCO was recently invited to the OFSTED inspector training session in Manchester. This post is about the session devoted to assessment. The session had three aims:

    • To ensure inspectors understand how they can evaluate the purpose of assessment.

    • To highlight the inferences that inspectors can draw from assessment data provided by schools.

    • To provide further support to inspectors in reaching valid and well-evidenced judgements on pupils’ progress.

     

    It is worth noting that prior to the session a pre-reading pack was shared, which listed Professor Daniel Koretz’s book “Measuring Up: What Educational Testing Really Tells Us”; chapter two, “What is a test?”, was highlighted for inspectors to read.

     

    What is a test - chapter 2 in a nutshell:

    If you get the chance, do read Daniel’s book - it really is worth it, and I fully accept that I don’t do it justice here.

     

    Daniel starts the chapter by discussing a 2004 Zogby poll of voting intentions in the race for the White House between George W Bush and John Kerry. He notes that the 1,018 likely voters surveyed indicated a 4 percentage point lead for George W Bush, and that this was reasonably accurate as the final result gave Bush a 2.5% margin.

     

    He goes on to point out that it wouldn’t be possible to measure the voting intentions of 121,000,000 people, and so pollsters instead use the sample as a proxy measure to make a prediction. The ability to do this depends on the design of the sample, the wording of the question and the willingness / ability of respondents to provide answers.
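
    As a rough illustration (the numbers here are my own back-of-envelope calculation, not from Koretz’s book), the standard margin-of-error formula for a simple random sample shows why roughly 1,000 respondents can stand in for an electorate of over 100 million:

    \[
    \text{margin of error} \approx z\sqrt{\frac{p(1-p)}{n}} = 1.96\sqrt{\frac{0.5 \times 0.5}{1018}} \approx 0.031
    \]

    That is about ±3 percentage points, comfortably consistent with a polled 4-point lead and an actual 2.5-point margin. Notice that the precision depends on the size of the sample, not on the size of the population it represents.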

     

    Koretz points out that educational achievement tests are much the same: they are proxies for a better, more comprehensive measure that it would not be possible to obtain in a single test (without the test lasting many days and having a very great number of questions!).

     

    A test can only sample from the broad range of knowledge a student has of a particular subject (the domain), and so the accuracy of a test will depend on careful sampling of content and skills. It will also depend on the wording of the questions. In the OFSTED training an example similar to the one below was given:

     

    What inference would you draw from data that revealed that 85% of the class answered this question incorrectly?

     

    Bob has six sweets. Bob perambulates towards the nearest purveyor of fine confectionery and proceeds to purchase another half dozen sugary delicacies. After debating whether to consume them himself, he decides, generously, to apportion his sweets equally between himself and his playmate, Alice. How many sweets is Bob left with?

    From the above example it is clear that mastery of the mathematical principles is unlikely to be the barrier that prevented 85% of the class from answering this question correctly!
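
    For the record, the arithmetic itself is trivial:

    \[
    6 + 6 = 12, \qquad 12 \div 2 = 6 \text{ sweets each}
    \]

    so an 85% failure rate is far more plausibly a measure of vocabulary (“perambulates”, “purveyor of fine confectionery”, “half dozen”) than of the addition and division the question presumably intends to assess.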

     

    Koretz points out that, in addition to the wording, we also need to consider the motivation of the test takers to do well and the behaviour of others, in particular teachers. He notes that “If there are problems with any of these aspects of testing, the results for the small sample of behaviour that constitute the test will provide misleading estimates of a student’s mastery of the larger domain” (D. Koretz, Measuring Up, p. 21). He goes on to say that “A failure to grasp this principle is at the root of widespread misunderstandings of test scores” and that it “has also resulted in uncountable instances of bad test preparation by teachers and others, in which instruction is focused on the small sample actually tested rather than the broader set of skills the mastery of which the test is supposed to signal” (D. Koretz, Measuring Up, p. 22).

     

    It was interesting to see the following quote appear in the conference notebook:  

    “Individual students’ scores on an individual test may be thought of as ‘...an inadequate report of an inaccurate judgement by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite amount of material’” - Paul Dressel, 1957

     

    The message that test data should be treated with caution - or, as Koretz puts it, “... don’t treat ‘her score on the test’ as a synonym for ‘what she has learned’” (D. Koretz, Measuring Up, p. 10) - was perhaps illustrated with an activity similar to the one below:

     

    In a meeting with school leaders, you are presented with a vast amount of in-school assessment information. It shows that pupils are doing much better than the published data suggest. You are not familiar with the methodology and have no way of knowing whether it is valid.

    • What are the potential issues with such data?

    • What would you do next?

     

    The point of the task was to encourage inspectors to ask questions beyond the data presented. On the table where I was sitting, suggestions for finding out more included book scrutiny, pupil conversations, and asking what, in the leaders’ opinion, had happened that would account for the improvement.

     

    Glossary as provided to inspectors, Autumn 2018

     

    Components: the building blocks that have been identified as the most useful for subsequent learning.

     

    Composite: the complex activity / skill / performance that the components combine to achieve.

     

    Construct underrepresentation: occurs when a test fails to cover all of the intended construct. A Year 9 end-of-year English exam may include only some comprehension questions and a creative writing task, whereas the ‘big idea’ about which inferences will be drawn is ‘English’. With other aspects of ‘English’, such as speaking and listening, writing non-fiction and so on, left untested, the construct - ‘English’ - is underrepresented. In reality we would be unable to create a single test that assesses ‘English’ without construct underrepresentation - it would be too big and cumbersome. So we have to be precise about which aspects of the big idea are being assessed, and then ensure that, over time, we sample from these areas in order to build up as reliable a picture as possible of what a child knows, can do and understands.

     

    Construct-irrelevant variance: sometimes occurs because tests are just too big and try to do too many things. You end up with a partial assessment of one thing and a partial assessment of other things, leading to inferences that become very confused and, ultimately, meaningless.

     

    Reliability: consistency from one measurement to the next. For example, bathroom scales are reliable if the measurement of an object’s weight is consistent. Of course, the scales might be ‘reliably wrong’ if they are badly calibrated. If findings from an assessment are replicated consistently, we can say they are reliable.
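
    As an aside (this formulation is standard classical test theory, not part of the glossary provided to inspectors): an observed score X can be modelled as a true score T plus random error E, and reliability is then the share of observed-score variance attributable to true scores:

    \[
    X = T + E, \qquad \text{reliability} = \frac{\sigma^2_T}{\sigma^2_T + \sigma^2_E}
    \]

    On this definition, badly calibrated bathroom scales can still be highly reliable: what reliability captures is random variation, not a consistent bias.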

     

    Validity: the quality of inferences drawn from a test. A particular assessment might provide good support for one inference but weak support for another. Often inferences drawn from assessments have low validity because the test is not testing what the user thinks it is testing.

     

    Transfer: The application of knowledge. A distinction can be made between ‘near transfer’ and ‘far transfer’. If you are transferring information to a very similar context that is ‘near transfer’ such as a question that asks about the same material in a new way. Applying information to a novel problem would be ‘far transfer’. Transfer is difficult to achieve but is the main goal of education.

     

    Inference: a conclusion reached on the basis of evidence and reasoning.

     

    Domain: Tests are about making a measurement, and generally, tests are trying to measure something huge. The technical term for what we are trying to measure is the domain. The domain that the tests are trying to measure is the extent of a pupil’s knowledge and skills and their ability to apply these in the real world in the future.

  • Observing Ofsted’s National Spring Conference - Gill Coulton

    Tue 16 Oct 2018 Gill Coulton


    As part of JUSCO’s engagement with Ofsted, members of the steering group were offered the opportunity to observe at recent Ofsted training events. So, despite my long-held conviction never to cross to the dark side, back in April I attended the Ofsted Spring Conference for the London region … and found it to be a really worthwhile piece of CPD.

     

    The conference training was delivered by HMI to an audience of additional inspectors, the majority of whom were school, academy-trust or LA-based practitioners. On the schedule were four sessions:

    • developing a common understanding of progress and its relationship to a high quality curriculum;

    • using this understanding of curriculum to inform inspection practice;

    • raising awareness of children who ‘fall out’ of education - evaluating schools’ use of exclusion, alternative provision and off-rolling;

    • increasing clarity on the government’s careers strategy.

     

    Whilst the training was preparatory for the forthcoming 2019 framework, there was an intention that inspectors could, within the confines of the current handbook, begin to consider its message with immediate effect. That said, although one session dealt with practical inspection activity on evaluating the use of exclusion, the tone of the curriculum sessions was more academic than procedural. As if to underscore this, Amanda Spielman prefaced the training materials with a word on the importance of ensuring that “all inspectors are well-equipped, intellectually and practically” for their inspection role (p. 2, Schools National Conference Notebook, Spring 2018). True to the academic model, inspectors had been issued with pre-reading which the trainer acknowledged to be a challenging read. Throughout the proceedings, there were strenuous efforts to avoid spawning new Ofsted myths, with reminders to inspectors to engage with the content as a whole rather than trying to reduce it to checklists and sound bites.

     

    For the purposes of this feedback, I’ve focused on the curriculum and progress elements of the training. Whilst the entire day was interesting, these two themes were of greatest relevance to JUSCO members and were undoubtedly the heavy-weights of the day.

     

    Why foreground curriculum?

    Sean Harford has shared extensively on social media and in the educational press that Ofsted are concerned with curriculum design as encapsulated by the three ‘I’s:

    • Intent: a framework for setting out the aims of a curriculum, including the knowledge and understanding to be gained at each stage;

    • Implementation: the structure and narrative of the curriculum within the specific context of a school;

    • Impact: an evaluation of what pupils have gained against expectations

     

    Essentially, the training sought to establish the rationale for this agenda. Mention was made of Ofsted’s recent curriculum review, which highlighted a lack of expertise, narrowing through teaching to the test and a tendency to confuse the curriculum with key assessments or qualifications. Excerpts from Amanda Spielman’s commentary and speeches echoed these findings as she sought to direct attention back to the core purpose or substance of education as something distinct from exam grades and progress measures; the heart of education is, she argues, “the vast accumulated wealth of human knowledge and what we choose to impart to the next generation” (p. 4, Conference Notebook). The other driver of the curriculum agenda is the set of evidence-based claims from cognitive psychology that have pushed ‘knowledge’ into the educational spotlight. Just to be clear, there was no mention of knowledge organisers. The ‘knowledge’ being described here was understanding of vocabulary, concepts, ideas and processes, not just people, events, places and so on; in other words, not simply a collection of discrete facts, but a deep body of knowledge that is the substance of education.

    According to the research, the accumulation of vocabulary concepts creates interconnected knowledge webs - or schemas - which in turn enable more productive processing when new material is encountered. The more schemas a learner builds, the more numerous the possible connections to new learning become - basically a snowball effect. The retention of knowledge, therefore, becomes critical for making new learning meaningful and ensuring continued learning success. This has implications for social justice, since some children are unlikely to acquire sufficient vocabulary or accumulate sufficient background knowledge outside of schooling.

     

    Indeed the importance of vocabulary acquisition was repeatedly reinforced.  Participant activities demonstrated the importance of Tier 2 vocabulary knowledge in enabling children to comprehend new material. Texts from the 2016 KS2 reading paper were used by way of illustration: delegates had to deconstruct the passages (remember The Lost Queen and the dodo!) to reveal a multitude of vocabulary concepts drawn from wider subject learning (ancestor, extinct, predator etc), knowledge of which allowed readers to extract deeper meaning from the texts. The ‘take-away’ for inspectors was that a broad, carefully structured curriculum is essential if children are to build the background knowledge necessary for success with learning per se, and this type of reading matter specifically.  

     

    Further illustrations were provided in the tasks that followed, including examples of concept-rich Tier 2 words that could be taught during familiar topics, like Roman Britain. Ideas such as empire, invasion, good roads, wealthy landowner and so on, all of which are highly transferable concepts, were explored in terms of their relevance to other subjects and themes; the conclusion was that through careful sequencing, these ideas could be reinforced and developed across curriculum contexts.

     

    Another task invited inspectors to consider how vocabulary concepts grow in complexity: for example, the word prince, initially understood in Reception/Year 1 in a fairytale or Disney sense, might become a more sophisticated idea of monarchy by the time a child reaches Year 6, perhaps even forming part of a knowledge web that includes succession, democracy and so on. Inspectors also discussed how this knowledge could be developed across subjects in a broad curriculum; conversely where a curriculum was narrowed to be test focused, learners might well miss out on this breadth of background knowledge.  

     

    What does Ofsted mean by progress?

    How was progress defined? Well, not with negative numbers, confidence intervals or a predetermined number of ‘leaps’ per year. The rationale HMI shared with inspectors again comes from cognitive psychology: ‘if nothing has changed in long-term memory then nothing has been learned’ (Sweller J, Ayres P, Kalyuga S (2011), Cognitive Load Theory). As a logical extension, Ofsted’s message was that progress means ‘knowing more and remembering more’. A quote from David Didau was used to explore the implications of this: ‘[learning] cannot be observed in the here and now. The only way to see if something has been retained over time and transferred to a new context is to look at what students can do later and elsewhere’ (p. 4, Schools National Conference Notebook, Spring 2018).

     

    What does a well-designed curriculum look like?

    HMI explicitly stated that Ofsted does not advocate any particular curriculum nor any specific curriculum model; their remit is to ascertain how well a school’s curriculum has been designed to ensure its learners gain useful, cumulative knowledge.

     

    Inspectors explored a number of defining features of a well-designed curriculum. It should not be too narrow as this would likely preclude pupils’ acquisition of wider knowledge. Content and structure need to be carefully considered: the choice of vocabulary and concepts to be taught should be useful for what is yet to come, transferable to other contexts and be of age-appropriate demand; and the sequence of learning should enable pupils to make progress by incrementally building knowledge.  Such a curriculum might address the ‘vertical’ progression within a subject but also integrate ‘horizontal’ links between different subjects in the same year group. Particularly important is how curriculum design supports children who are unlikely to acquire sufficient Tier 2 vocabulary and background knowledge outside of school.

     

    In order to gain insight into effectiveness, inspectors are likely to be interested in how school leaders articulate their decisions about the curriculum. Pertinent questions for leaders and inspectors to consider might include the following:

    • How do school leaders sequence knowledge to aid retention?

    • Is the knowledge pupils are acquiring useful in terms of generating further learning?

    • Has thought been given to the vertical, horizontal and diagonal links that integrate a curriculum?

    • In lessons, are teachers ensuring that pupils incrementally gain knowledge – are they emphasising the right content?

    • Are pupils, particularly those who may not otherwise have access, gaining cultural capital from the knowledge they are acquiring?

    • What are pupils remembering long term?

    • How do teachers and leaders know pupils are making progress by knowing more and remembering more – how do they make sure that learning sticks?

    • Is there evidence that learning is durable and lasting?

     

    Implications for JUSCO members

    Aside from the training itself, it is worth noting that the participants I spoke to were not well informed about the junior/primary data gap, though they were all secondary practitioners. Data formed no part of the April course, so there was no opportunity to explore this further, but it could be an indicator that levels of awareness continue to vary across the sector. It is also worth considering whether the proportion of junior school headteachers serving as additional inspectors is similar to that of their primary counterparts. This conference alone was excellent CPD in the theory of curriculum design; the insight that Ofsted training provides must also make being on the receiving end of inspection an easier process to navigate. If junior heads are underrepresented, could it be something for JUSCO to explore?

     

    Overall, the fact that the usual performance data were not mentioned in the progress narrative was refreshing; in fact, reductive progress measures were notable by their absence. Ofsted’s adopted progress definition seems to be more grounded in learning than ‘performance’, with inspectors urged to consider the progression of knowledge in a curriculum. That said, HMI acknowledged Ofsted was busy working out how inspectors could best evaluate progress during inspections. In the foreword to the training pack, Amanda Spielman refers to the value of professional conversations between inspectors and school leaders and calls for a deeper understanding to inform inspection practice. She also references the accountability functions as intended at the time of Ofsted’s inception: published data as one discrete accountability measure, inspection as another - complementary but separate, not the latter rubber-stamping the former.

     

    There was much to consider from these sessions. From my perspective, it felt more like a drive to change the educational mindset rather than merely another round of inspection reform.  Although the way forward is bound to be fraught with difficulty for both Ofsted and for schools, I came away persuaded that those at the top of Ofsted really are trying to pursue a very different course.

     

    For those who want to know more, the following books were recommended to inspectors:

    • Daniel T Willingham (2009), Why Don’t Students Like School?

    • Peter C Brown, Henry L Roediger, Mark A McDaniel (2014), Make It Stick

    • Isabel Beck, Margaret G McKeown and Linda Kucan (2013), Bringing Words to Life: Robust Vocabulary Instruction


  • Caveats in performance tables and IDSR

    Sun 02 Sep 2018 Chris McDonald

    JUSCO was pleased to receive the following message from the Department for Education; this follows meetings that the JUSCO steering group has had with the Department for Education and OFSTED. JUSCO welcomes the continued opportunity to engage with both the Department and OFSTED.

     

    ------------------------------------------

    We are pleased to confirm that we have secured agreement to update performance tables and IDSR with the following caveat for junior schools. This will be in place for data released during academic year 2018/19 (IDSR will be updated in September, and performance tables for the next releases).

     

    School Performance Tables: Junior Schools

    We know from national data that pupils at junior schools, on average, have higher attainment scores at the end of key stage 2 than pupils at all other primary schools. However, on average they also have lower progress scores. This may be for a variety of reasons and should be taken into account when comparing their results to schools which start educating their pupils from the beginning of key stage 1.

    IDSR: Junior Schools

    We know from national data that pupils at junior schools, on average, have higher attainment scores at the end of key stage 2 than pupils at all other primary schools. However, on average they also have lower progress scores, which may be for a variety of reasons. Inspectors should be aware of this and, as with any inspection, carefully consider a range of information and data including the progress of current pupils in all year groups.


  • Watch the presentation from OFSTED's Sean Harford at our JUSCO 2018 conference on curriculum: intent, implementation, impact

    Sun 29 Jul 2018 Chris McDonald

    Watch Sean Harford, OFSTED's National Director, Education, give his presentation on OFSTED's work on curriculum. This film is from our April 2018 conference; you can also download the slides in PDF format.

     

    http://www.juniorschoolcollaboration.co.uk/ofsteds-work-on-curriculum-with-sean-harford-natio/

     

  • OFSTED Inspection outcomes by school type and IDACI up until 31st May 2018 #JUSCO

    Sun 15 Jul 2018 Chris McDonald

    A huge thank you to OFSTED for sharing this with us; particular thanks to Sean Harford and to Oli Baytun.

     

    In the charts below, the groupings of schools into infant/junior/other are based on the statutory age ranges of each school, rather than the actual numbers of pupils in each school at a point in time.

     

    Deprivation is based on the Income Deprivation Affecting Children Index (IDACI) 2015. The deprivation level of a school is based on the mean of the deprivation indices associated with the home postcodes of the pupils attending the school, rather than the location of the school itself.

     

    The IDACI bands are the same ones that OFSTED show in their published Management Information and Official Statistics. They are calculated based on all primary schools, rather than tailored quintiles produced for each group of schools/each chart. This means that there are different numbers of schools in the quintiles in some of the charts. However, as OFSTED point out, one advantage of using the same quintiles on each chart is consistency: if a school is in the ‘most deprived’ band in the infant chart, it is also in the ‘most deprived’ band in the ‘all primary schools’ chart.
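
    To make the distinction concrete, here is a minimal sketch in Python/pandas (the data and column names are invented for illustration; this is not OFSTED's actual code): quintile boundaries are computed once across all schools, and only then are the schools split into groups, so every chart shares the same bands.

    import pandas as pd

    # Illustrative data: one row per school, with its mean pupil IDACI score
    # and its phase. Real IDACI scores run from 0 (least) to 1 (most deprived).
    schools = pd.DataFrame({
        "school": list("ABCDEFGHIJ"),
        "phase": ["infant", "junior", "other", "infant", "junior",
                  "other", "infant", "junior", "other", "junior"],
        "idaci": [0.12, 0.35, 0.08, 0.22, 0.41, 0.18, 0.30, 0.05, 0.27, 0.15],
    })

    # Quintile bands are computed ONCE, across all schools together...
    schools["band"] = pd.qcut(
        schools["idaci"], q=5,
        labels=["least deprived", "2", "3", "4", "most deprived"],
    )

    # ...so when the schools are then split by phase, each group keeps the
    # same band boundaries. Counts per band will be uneven within a group,
    # but a school sits in the same band on every chart.
    print(schools.groupby(["phase", "band"], observed=True).size())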

     

    IDACI quintiles and inspection outcomes are based on OFSTED's end-of-May Management Information, available here:

     

    https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/715338/Management_information_-_schools_-_31_May_2018.xlsx

     

    OFSTED point out that, following a consultation, they have recently changed their methodology for reporting on inspection outcomes in their official statistics, to include the inspection outcomes of predecessor schools. An update based on the new methodology is coming soon.

  • Liz Twist from the NFER explains the NFER Junior School Baseline Project - full film from our April conference.

    Sun 08 Jul 2018 Chris McDonald

    Liz Twist from the NFER explains the NFER Junior School Baseline Project in this film from our April conference. Find out more here: 

    http://www.juniorschoolcollaboration.co.uk/nfer-junior-school-baseline-project/

All of our films and presentation downloads are now available in our members' area.