that BGCSE examination involves, plus incorporating the practical work aspects of the subject. Over and above the contribution of continuous assessment, forecast grades accounted for a significant percentage (9% to 11%; see Table 5) of the criterion variance in each of the three predictions based on the four years’ data. This is partly due to the significant amount of variance they share with continuous assessment (2.75% to 4.50%) in the three predictions and partly due to their seeming lack of content validity, since they do not involve the practical aspects of physical education. But for these limitations, and given their close resemblance to the criterion, one would have expected forecast grades from the Mock examination to account for more of the criterion variance.
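The shared-variance figure quoted above follows from the standard two-predictor commonality decomposition; as a sketch (the symbols $C$ for continuous assessment, $F$ for forecast grades, and $Y$ for the BGCSE criterion are introduced here only for illustration and do not appear in the original tables):

$$\text{Unique}(F) = R^{2}_{Y\cdot CF} - r^{2}_{YC}, \qquad \text{Common}(C,F) = r^{2}_{YC} + r^{2}_{YF} - R^{2}_{Y\cdot CF},$$

where $r_{YC}$ and $r_{YF}$ are the zero-order validities and $R_{Y\cdot CF}$ is the multiple correlation from both predictors. On this reading, the 9% to 11% accounted for over and above continuous assessment corresponds to $\text{Unique}(F)$, and the 2.75% to 4.50% shared with continuous assessment corresponds to $\text{Common}(C,F)$.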
Predicting BGCSE performance in physical education from coursework and forecast grades in Botswana differentiates between male and female students and hence leads to bias in any selection if such prediction is based on cumulative data. The significantly different predictive validities (.473 for males and .551 for females; Z = –2.53, p < .05) are indicative of bias in prediction if and when cumulative data across the years are used.
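As a minimal sketch of the test behind this comparison, the Fisher r-to-z test for two independent correlations can be written as follows; the validities are those reported above, but the group sizes are assumptions, since the male and female sample sizes are not restated here:

```python
# Fisher r-to-z test for the difference between two independent correlations.
import numpy as np
from scipy.stats import norm

def compare_correlations(r1, n1, r2, n2):
    """Two-tailed z test for the difference between two independent Pearson r's."""
    z1, z2 = np.arctanh(r1), np.arctanh(r2)        # Fisher r-to-z transforms
    se = np.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))  # standard error of the difference
    z = (z1 - z2) / se
    p = 2 * norm.sf(abs(z))                        # two-tailed p value
    return z, p

# Reported validities; the group sizes (n1, n2) are hypothetical placeholders.
z, p = compare_correlations(r1=0.473, n1=400, r2=0.551, n2=400)
print(f"z = {z:.2f}, p = {p:.3f}")
```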
The story was not quite the same when the prediction was based on yearly data. In none of the years was such significant differential prediction observed (see Table 5). For each of these years, coursework grades and forecast grades were seen to predict grades in BGCSE physical education significantly, both individually and in combination, with predictive validities ranging from .457 to .591 for coursework grades, from .192 to .368 for forecast grades, and from .521 to .691 for both predictors combined. Though the significant predictive validities of coursework and forecast grades, singly and in combination, persisted across each of the four years, significant differential prediction was not observed for any of the years considered separately. One can then say that the observed significant differential prediction for the combined four years’ data is an artifact of the increased sample size. In other words, combining the data for four years increased the sample size and hence the likelihood of detecting a significant difference in predictive validities.
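As an illustration of how the individual and combined predictive validities reported above would typically be computed, the sketch below uses hypothetical, randomly generated scores rather than the study’s data, with coursework and forecast grades as predictors of a BGCSE criterion:

```python
# Zero-order validities for each predictor and the multiple R for both combined.
import numpy as np

rng = np.random.default_rng(0)
n = 300                                            # hypothetical yearly sample size
coursework = rng.normal(50, 10, n)
forecast = 0.6 * coursework + rng.normal(0, 8, n)
bgcse = 0.5 * coursework + 0.2 * forecast + rng.normal(0, 9, n)

# Individual (zero-order) predictive validities
r_cw = np.corrcoef(coursework, bgcse)[0, 1]
r_fc = np.corrcoef(forecast, bgcse)[0, 1]

# Combined prediction: multiple R from an ordinary least-squares regression
X = np.column_stack([np.ones(n), coursework, forecast])
beta, *_ = np.linalg.lstsq(X, bgcse, rcond=None)
multiple_R = np.corrcoef(X @ beta, bgcse)[0, 1]

print(f"r(coursework) = {r_cw:.3f}, r(forecast) = {r_fc:.3f}, R(combined) = {multiple_R:.3f}")
```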
It could be concluded that in Botswana senior secondary schools, for students’ performance in physical education between 2005 and 2008, coursework grades and forecast grades, individually and in combination, were significant predictors of BGCSE performance. When the four years’ data were combined, they also showed significant gender-based differential prediction, over-predicting the criterion for female students while under-predicting it for male students. But when the analyses were done separately for each of the four years, such significant gender-based differential prediction was not observed for any of the years.
Recommendations
Given that the results of predicting BGCSE grades may be used to flag students whose BGCSE grades might be reviewed (Masole & Utlwang, 2005), the Botswana Examination Council should ensure that grades in this examination are predicted with measures that show no gender-based differential prediction. This would mean adopting measures and procedures that enhance the predictive validity of each of the predictor variables by improving their content validity at the classroom level.
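One conventional way of screening for such gender-based differential prediction, sketched below with hypothetical data rather than as the Council’s actual procedure, is to regress the criterion on a predictor, a gender indicator, and their interaction, and then examine the gender terms: a significant gender main effect signals intercept differences (systematic over- or under-prediction for one group), while a significant interaction signals slope (validity) differences.

```python
# Regression check for differential prediction (hypothetical data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500                                   # hypothetical sample size
coursework = rng.normal(50, 10, n)
female = rng.integers(0, 2, n)            # 1 = female, 0 = male
bgcse = 0.5 * coursework + 2.0 * female + rng.normal(0, 8, n)

# Design matrix: constant, predictor, gender indicator, predictor-by-gender interaction
X = sm.add_constant(np.column_stack([coursework, female, coursework * female]))
model = sm.OLS(bgcse, X).fit()

# Significant coefficients on the gender or interaction terms flag
# intercept- or slope-based differential prediction, respectively.
print(model.params)
print(model.pvalues)
```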
Similar studies of the validity of coursework grades, forecast grades, and other measures in predicting BGCSE grades in different subjects should be carried out. Other variable-selection methods of multiple regression analysis could be used instead of the stepwise method used in this study. Such studies should also consider variables such as ethnicity, location, and socio-economic level as factors in the prediction of performance in the BGCSE physical education examination by coursework and forecast grades among secondary school students in Botswana.
REFERENCES
Adeyemi, T. O. (2008). Predicting students’ performance in senior secondary certificate examinations from performance in junior secondary certificate examinations in Ondo State, Nigeria. Humanity & Social Sciences Journal, 3, 26-36.
Adeyemi, T. O. (2010). Credit in mathematics in senior secondary certificate examinations as a predictor in educational management in universities in Ondo and Ekiti States, Nigeria. Middle-East Journal of Scientific Research, 5, 235-244.
Akbari, R., & Alivar, N. K. (2010). L2 teacher characteristics as pre-
dictors of students’ academic achievement. The Electronic Journal
for English as a Second Language, 13, 1-22.
Gardner, D., & Deadrick, D. L. (2008). Underprediction of performance for US minorities using cognitive ability measures. Equal Opportunities International, 27, 455-464. doi:10.1108/02610150810882305
Iramaneerat, C. (2007). Rater errors in a clinical skills assessment of
medical students. Evaluation & the Health Professions, 30, 266-283.
doi:10.1177/0163278707304040
Kobrin, J. F., Patterson, B. F., Shaw, E. J., Mattern, K. D., & Barbuti, S. M. (2008). Validity of the SAT® for predicting first-year college grade point average. College Board Research Report No. 2008-5. New York: College Board.
Kyei-Blankson, L. S. (2005). Predictive validity, differential validity, and differential prediction of the subtests of the Medical College Admission Test. Unpublished Doctoral Dissertation, Athens, OH: Ohio University.
Little, C. (1992). GCSE coursework: A retrospective assessment.
Teaching Mathematics and Its Applications, 11, 56-63.
doi:10.1093/teamat/11.2.56
Masole, T. M., & Utlwang, A. (2005). The reliability of forecast grades
in predicting students’ performance in the final Botswana General
Certificate of Secondary Education Examinations.
Nenty, H. J. (1979). An empirical assessment of the culture fairness of the Cattell Culture Fair Intelligence Test using the Rasch Latent Trait Measurement Model. Unpublished Doctoral Dissertation, Kent, OH: Kent State University.
Ramatlala, M. S. (2009). The validity of coursework scores in predict-
ing performance in Botswana General Certificate of Secondary
Education Physical Education examinations among senior secondary
school students in Botswana. Unpublished Master’s Thesis, Gabo-
rone: University of Botswana.
Robins, K. (1997). What in the world’s going on? In P. D. Gay (Ed.),
Production of culture/cultures of production. London: Sage/Open
University.
Thobega, M., & Masole, T. M. (2008). Relationship between forecast
grades and component scores of the Botswana General Certificate of
Secondary Education Agriculture.
Young, J. W. (2001). Differential validity, differential prediction, and
college admission testing: A comprehensive review and analysis.
College Board Research Report No. 2001-6. New York: College En-
trance Examination Board.
Young, J. W. (1991). Gender bias in predicting college academic per-
formance: A new approach using item response theory. Journal of
Educational Measurement, 28, 37-47.
doi:10.1111/j.1745-3984.1991.tb00342.x