QTS Q13

Using National and Statistical Data in GCSE ICT

By PGCE, POT 3, Teaching

Using local and national statistical data in my teaching

I have been looking at how pupils perform in GCSE ICT over this year. I have noticed a trend that boys tend to underperform in coursework-based subjects, of which the GCSE OCR Nationals option is one.

The BBC article quoted below indicates how boys are underachieving. I have also had a look at the results from my home school: http://bit.ly/mPy6DD

Gender gap

 

Boys continue to lag behind girls in most subjects, a trend of more than two decades.

The gender gap was at its widest in the early 2000s for A*-Cs, and in the middle of the decade for A*-As.

Explanations for the gap vary, although it is widely thought that girls perform better than boys in continued assessment and coursework. Other reasons given for the gap in GCSE performance are maturity and motivation.

However, a higher percentage of boys than girls get A and A* grades in economics and additional maths. Boys also pulled ahead of girls at these top grades in maths last year, and in physics at A grade – and managed to maintain the edge in both in 2010.

 

The main solution to tackle boys' underachievement is to chunk and segment coursework into manageable tasks that can be achieved lesson by lesson. We have been creating our OCR Nationals work with small slides that each pupil can complete in a lesson, and I have been implementing this in my Year 9 teaching of databases.

I have personally seen how well this works for in-class activities. However, it is difficult when pupils do not complete all the work they need to do in a lesson, as this has ramifications later in the term because later tasks can depend on earlier ones.

Please see my POT 3 teaching for Year 9.

Use of Data in Independent Schools

By PGCE, POT 2, Teaching

Yesterday I attended a training session on how KES use data in the school to monitor and track pupils' ability.

When pupils enter the school in Year 7 there are up to 100 new pupils, and they sit a test to assess their ability level. The test is run by a Durham-based company called MidYis (http://www.midyisproject.org/), which assesses pupils' intelligence and ability in areas such as English and maths. The testing is helpful for identifying the more able and less able pupils at the top and bottom ends of the spectrum: the top students score around 140 points and the bottom students around 70.

The problem KES find is that the correlation between what pupils are achieving and what they should achieve is quite broad, because the cohort sits at the top end of the ability spectrum when compared against national averages. KES therefore find it important to compare their results against other independent schools, as they are well above the national average. Most of the pupils are strong and capable, which makes it hard to distinguish between pupils in the middle of the cohort. The main purpose of the test is not for the pupils themselves but to track how certain departments are doing, by comparing the MidYis results taken in Year 7 with those taken in Year 9. Each testing session costs £500 per year group.
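As a rough illustration of that Year 7 vs Year 9 comparison (this is only a sketch with invented pupil names and scores, not KES's actual process or real MidYis data), a department could look at each pupil's change in standardised score and the average change across the cohort:

```python
# Illustrative sketch only: invented scores, not real MidYis data.
# Standardised scores are centred on 100; ~140 is the top end, ~70 the bottom.

year7 = {"Alice": 128, "Ben": 104, "Cara": 96}   # hypothetical Year 7 MidYis scores
year9 = {"Alice": 134, "Ben": 101, "Cara": 105}  # hypothetical Year 9 scores

# Change per pupil between the two sittings
changes = {name: year9[name] - year7[name] for name in year7}

# Average change across the cohort, a crude "value added" indicator
mean_change = sum(changes.values()) / len(changes)

for name, delta in changes.items():
    print(f"{name}: {delta:+d}")
print(f"Cohort mean change: {mean_change:+.1f}")
```

The point of averaging over the cohort is that individual scores bounce around, but a consistent shift across a whole year group says something about the department's effect between the two sittings.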

KES find their own in-house results more useful when it comes to tracking pupils' progress statistically. They use their SIMS system to set targets based on three years of previous results, then plot graphs of what pupils should be achieving against what they are achieving. This shows how departments are doing and whether they are under- or over-achieving. It is hard to treat this as gospel, though, as exam specifications change from year to year, particularly in subjects like English Literature.
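The target-setting step can be sketched as follows (again purely illustrative: the grade-point figures and the 0.2 flagging threshold are invented, and this is not how SIMS actually computes anything):

```python
# Illustrative sketch only: invented grade-point data, not KES's SIMS setup.
# Targets come from the mean of the previous three years' results;
# a department is flagged when this year's mean deviates noticeably.

previous_years = [6.1, 6.3, 6.2]   # hypothetical mean grade points, last 3 years
target = sum(previous_years) / len(previous_years)

this_year = 6.6                    # hypothetical current mean grade point
deviation = this_year - target

THRESHOLD = 0.2                    # arbitrary tolerance for "on target"
if deviation > THRESHOLD:
    verdict = "over-achieving against target"
elif deviation < -THRESHOLD:
    verdict = "under-achieving against target"
else:
    verdict = "broadly on target"

print(f"Target {target:.2f}, achieved {this_year:.2f}: {verdict}")
```

The caveat in the paragraph above applies directly here: if the exam specification changed between the three baseline years and the current year, the target is not comparing like with like.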

The important point here is that, according to KES, they do not want mathematically calculated data and results to dictate how well their pupils are judged to be doing; they use the SIMS and exam results only as a way to show pupils where they are and what they could be achieving. It is great to see that they are trying to steer away from being completely data-driven.

From my point of view it is really interesting to see how this statistical data differs between independent and maintained schools. I can personally see the difference when teaching in an academically selective school: the pupils are more able, and you need to set work at a higher level to stretch and expand their ability. In comparison, at Norton Hill School the ability is lower and I have had to adjust my teaching to cater for those pupils. This is not a problem or an issue, just something I have been learning, particularly when it comes to differentiation in the classroom.