Creative Education
2012. Vol.3, No.2, 232-240
Published Online April 2012 in SciRes (http://www.SciRP.org/journal/ce) http://dx.doi.org/10.4236/ce.2012.32037
The Effect of an Instruction Designed by Cognitive Load
Theory Principles on 7th Grade Students’ Achievement
in Algebra Topics and Cognitive Load
Aygil Takir1, Meral Aksu2
1Turkish Education Association, Ankara, Turkey
2Faculty of Education, Department of Educational Sciences, Middle East Technical University (METU),
Ankara, Turkey
Email: aygilt@gmail.com, aksume@metu.edu.tr
Received March 14th, 2012; revised April 9th, 2012; accepted April 19th, 2012
The purpose of this study was to investigate the effect of an instruction designed by Cognitive Load Theory (CLT) principles on 7th grade students’ achievement in Algebra topics and their cognitive load. A quasi-experimental study was conducted over a total of six weeks with 80 students. The instruction designed by CLT principles was used in the experimental group, while the instruction recommended by the Ministry of Education (MONE) was used in the control group. The researchers developed Teachers’ Guidelines and Students’ Booklets for use in the experimental group. At the end of each unit, the Subjective Rating Scale (SRS) was used to measure students’ cognitive load. At the end of the treatment, the Algebra Achievement Test (AAT) was administered to both groups. Both descriptive and inferential statistical techniques were used for analyzing the data. Results showed that, within the limitations of the study, the instruction designed by CLT principles was effective for teaching Algebra.
Keywords: Cognitive Load Theory; Cognitive Load; Subjective Measure of Cognitive Load; Algebra
Achievement; Efficiency of Instruction
Introduction
The Cognitive Load Theory (CLT) has emerged over the last
decade as an influential theory of educational psychology and
instructional design. The CLT originated in the 1980s through
the work of John Sweller and his colleagues at the University of
New South Wales (Clark, Nguyen, & Sweller, 2005; Paas,
Renkl, & Sweller, 2003).
The CLT is a theoretical framework grounded in the learner’s
cognitive architecture (Janssen, Kirschner, Erkens, Kirschner,
& Paas, 2010) that assumes that working memory (WM) is very
limited in terms of being able to store and process information
(Cowan, 2005; Miller, 1956; Paas, Van Gog, & Sweller, 2010)
and that long-term memory (LTM) has an effectively unlimited capacity for storing information. The
CLT predicts learning outcomes by taking into consideration
the capabilities and the limitations of this architecture (Plass,
Moreno, & Brünken, 2010).
As this definition suggests, CLT differs from other instructional theories in its emphasis on human cognitive architecture. It considers knowledge of human cognitive architecture to be critical for instructional design, and it holds that the effectiveness of an instruction depends heavily on whether it takes the characteristics of human cognition into account.
Cognitive load can be defined as a multidimensional con-
struct representing the load that performing a particular task
imposes on the learner’s cognitive system (Paas, Tuovinen,
Tabbers, & Van Gerven, 2003; Paas & Van Merrienboer, 1994).
The roles of the WM and LTM in human cognitive architecture allow the sources of cognitive load to be categorized as intrinsic, extraneous, and germane (Paas et al., 2003). If load
is imposed by the number of information elements and their
interactivity, it is called intrinsic (ICL). If it is imposed by the
manner in which the information is presented to learners and by
the learning activities required of learners, it is called extrane-
ous or germane. Extraneous load (ECL) is imposed by information and activities that do not contribute to the processes of schema construction and automation, whereas germane load (GCL) is related to information and activities that foster these processes (Paas, Renkl, & Sweller, 2004).
There are many techniques for assessing cognitive load. The most common technique among CLT researchers is the subjective rating scale technique, which is based on the assumption that “people are able to introspect on their cognitive processes and to report the amount of mental effort expended” (Paas et al., 2003). Paas and Van Merriënboer (1994) used a rating scale for measuring perceived task difficulty, a 9-point Likert scale ranging from very, very low mental effort (1) to very, very high mental effort (9).
Efficiency is an important issue that should be considered in
CLT research. It can be defined as a property of instructional
products that results in faster learning, better learning, or both
(Clark et al., 2005). Paas and Van Merriënboer (1993) suggested a calculation approach for combining measures of mental load and performance that allows information to be obtained on the
relative efficiency of instructional conditions. Particularly, the
combination of performance and cognitive load measures has
been identified to constitute a reliable estimate of the cognitive
efficiency of instructional methods (Paas et al., 2003). Environments that produce higher learning outcomes with less cognitive effort are more efficient than environments that lead to lower outcomes with greater mental effort (Clark et al., 2005). To quantify efficiency, Paas and Van Merriënboer (1993) proposed an efficiency metric that relates cognitive effort to performance measures. In this approach, high task performance associated with low effort indicates high instructional efficiency, whereas low task performance associated with high effort indicates low instructional efficiency. The metric is calculated by subtracting the standardized cognitive load score from the standardized performance score.
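Written out with z-standardized scores, the metric of Paas and Van Merriënboer (1993) takes the following form; the symbol names z_P and z_R are introduced here only for illustration and are not the authors’ own notation:

```latex
% Instructional efficiency metric (Paas & Van Merriënboer, 1993).
% z_P = standardized (z-score) performance, z_R = standardized (z-score) mental effort.
\[
E = \frac{z_P - z_R}{\sqrt{2}}
\]
% E > 0: performance is high relative to the invested effort (high efficiency).
% E < 0: performance is low relative to the invested effort (low efficiency).
```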
The CLT has generated a range of techniques intended to reduce the ECL and maximize the GCL. CLT has been applied in several contexts and studied by using randomized, controlled experiments. The empirical results of these studies led to the demonstration of several instructional techniques, which are called CLT effects. As discussed in previous paragraphs, CLT differs from other instructional theories not only in its emphasis on human cognitive architecture but also in the methodology it uses.
There are many CLT effects that instructional designers can
consider when they plan an instruction. One of them is the
worked example effect which was also the main effect of this
study. It is a technique that decreases the ECL by replacing
some practice exercises with a series of worked examples, each
followed by similar practice exercise (Clark et al., 2005). A
completion problem is a partial worked example where the
learner has to complete some key solution steps (Sweller, Ayres,
& Kalyuga, 2011a). It is a hybrid between a practice assignment and a worked example. Like worked examples, completion examples reduce cognitive load; schemas can be acquired by studying the worked-out portions (Clark et al., 2005). Completion examples were used in this study especially for the topics thought to be difficult, to provide a smooth transition from worked examples to practice exercises.
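As a purely hypothetical illustration (not drawn from the Students’ Booklets used in the study), the progression from a worked example through a completion example to a practice exercise for a simple linear equation might look like this:

```latex
\textbf{Worked example (all solution steps shown):}
\[ 3x + 5 = 20 \;\Rightarrow\; 3x = 20 - 5 \;\Rightarrow\; 3x = 15 \;\Rightarrow\; x = 5 \]
\textbf{Completion example (key steps left for the learner to fill in):}
\[ 4x - 7 = 9 \;\Rightarrow\; 4x = \underline{\hspace{1cm}} \;\Rightarrow\; x = \underline{\hspace{1cm}} \]
\textbf{Practice exercise (full problem):} solve $5x + 2 = 17$ for $x$.
```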
The basic underlying principle of the modality effect is that
“complex visuals are understood more efficiently when ex-
planatory words are presented in an audio modality than when
presented in a written modality” (Clark et al., 2005). Therefore,
it is very important to consider verbal explanations during the
lesson hours. The redundancy effect of the CLT occurs when
unnecessary, additional information is presented to learners
(Sweller, 2010) or when identical information is presented in
multiple forms, which decreases rather than increases learning
(Pawley, Ayres, Cooper, & Sweller, 2005). Redundancy effect
is an important factor when designing instruction (Chandler &
Sweller, 1991) and, in its many forms, has a detrimental effect
on learning (Kalyuga, Chandler, & Sweller, 2001; Sweller &
Chandler, 1991). The expertise reversal effect occurs when an instructional procedure that is effective for novices, in comparison to an alternative instruction, becomes less effective as expertise increases. This effect is very important especially in classrooms where there is a mixture of novice and experienced learners. The expertise reversal effect has important implications together with the guidance fading effect of the CLT, which suggests that learners should first be presented with worked examples, followed by completion problems, and then full problem assignments (Renkl & Atkinson, 2003); this sequence was used in this study.
As described previously, the human cognitive architecture is
concerned with the manner in which cognitive structures are
organized. The relations between the WM and the LTM, in
conjunction with the cognitive processes that support learning,
are of critical importance for designing an instruction. Kir-
schner et al. (2006) expressed that the architecture of the LTM
provides the ultimate justification for instruction: the aim of all
instruction is to alter the LTM. If nothing has changed in the
LTM, nothing has been learned. They concluded that any in-
structional recommendation that does not specify what has been
changed in the LTM, or that does not increase the efficiency
with which relevant information is stored in or retrieved from
the LTM, is likely to be ineffective.
From the WM perspective, Kirschner et al. (2006) described
that any instructional theory that ignores the limits of WM
when dealing with novel information or ignores the disappear-
ance of those limits when dealing with familiar information is
unlikely to be effective. They maintain that learners, especially
novices, are unable to effectively process information due to the
limits of WM, and hence learning suffers (Tobias & Duffy, 2009).
The human cognitive architecture has obvious implications
for the amount of guidance and assistance provided to the
learners. Instruction should be explicit and clear. Based on this
architecture, there seems to be no purpose or function to with-
holding information from the learners so that they can discover
it for themselves (Tobias & Duffy, 2009). According to Clark et al. (2005), when learners’ engagement with an instruction is not directed toward schema acquisition and automation, it can impose an ECL. They suggested the use of directive rather than
student centered approaches for novice learners. As they stated,
the instructional designers should prepare directive lessons for
novice learners that provide brief content segments including
explanations, examples, and practices, and further, the instruc-
tional designers should prepare more student centered lessons
for more experienced learners.
The worked example, completion example, split attention,
modality, expertise reversal, redundancy, and guidance fading effects of the CLT are the central CLT effects of this study and were used in preparing the classroom materials and the implementation process. By considering the CLT effects and human cogni-
tive architecture principles, an instruction was designed and
implemented in a 7th grade Mathematics classroom. The purpose
of the study was to investigate the effect of this instruction
designed by the CLT principles on 7th grade students’ achieve-
ment in Algebra topics and cognitive load. More specifically,
the study aimed to determine whether the CLT treatment has an
effect on students’ Algebra achievement and cognitive load and
find out the efficiency score of the instruction in terms of the
students’ achievement and cognitive load measures.
The study aimed to answer the following questions:
1) Does the instruction designed by CLT have a significant effect on 7th grade students’ Algebra achievement and cognitive load?
2) Is there a significant difference between the efficiency
scores of students who were exposed to instruction designed by
CLT principles compared to instruction recommended by
MONE?
The following hypotheses, stated in null form, were tested to address the research questions given above.
Null Hypothesis 1.1: There is no significant mean difference in Algebra achievement and cognitive load between the groups of students who were exposed to the instruction designed by CLT principles and to the instruction recommended by MONE.
Null Hypothesis 2.1: There is no significant difference be-
tween the efficiency scores of students who were exposed to
instruction designed by CLT principles and to instruction rec-
ommended by MONE.
Method
Subjects of the Study
The subjects of the study were 80 7th grade students in a
public school in İstanbul, Turkey. Forty of the students were in a classroom called 7A and the other 40 were in 7B. The classes were randomly assigned as the experimental and control groups. The general distribution of the subjects of the study is shown in Table 1.
Design and Procedure
The instruction designed by the CLT effects was defined as
the independent variable; students’ achievement in Algebra
topics and cognitive load were defined as the dependent vari-
ables. The instruction designed by CLT effects was used in the
experimental group, while the instruction recommended by the
Ministry of Education (MONE) was used in the control group
of the study.
The study was conducted from the first week of December
2010 to the 3rd week of January 2011 (a total of six weeks, 19 class
hours). The students were assigned to classes randomly by the
school administration at the beginning of the school semester.
Therefore, random assignment of subjects to the experimental
and control groups was not possible, and as a result, a quasi-
experimental research design was utilized in the study. For the
same reason, the classes were heterogeneous in terms of gender
and academic achievement, and homogeneous in terms of so-
cioeconomic status (SES), all being from low SES.
The equivalence of the two groups defined above was checked by
comparing students’ previous year mathematics achievement
grades (MAG). The 6th grade mathematics achievement grades
of the subjects were obtained from the grade report forms of the
school records. An independent-samples t-test was conducted to test the equivalence of the groups. The students’ previous year
mathematics grades were similar in both experimental and con-
trol groups. Both groups were taught the same mathematics
content; but in experimental group, an instruction designed by
the CLT principles was applied.
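For readers who want to reproduce this kind of equivalence check outside SPSS, a minimal sketch with scipy is given below; the grade arrays and their values are placeholders, not the study’s data:

```python
import numpy as np
from scipy import stats

# Hypothetical previous-year mathematics achievement grades (MAG), on a 1-5 scale.
mag_experimental = np.array([3, 2, 4, 2, 3, 2, 3, 1, 4, 2])
mag_control = np.array([3, 3, 2, 4, 3, 2, 3, 3, 2, 4])

# Independent-samples t-test of group equivalence before the treatment;
# a non-significant result (p > 0.05) supports treating the groups as equivalent.
t_stat, p_value = stats.ttest_ind(mag_experimental, mag_control)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```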
The study was carried out in six Algebra Units: 1) Exponents;
2) Operations on Algebraic Exponents; 3) Equations; 4) Carte-
sian Coordinate Plane; 5) Graphs of the First Degree Equations
in One Unknown; and 6) Patterns and Relations. For each unit,
the researchers developed “Teacher Guidelines” and “Student’s
Booklets” for use in the experimental group.
Students’ Booklets and Teachers’ Guidelines
Students’ Booklets were developed by the researchers as
classroom materials which covered all Algebraic units of the
7th grade Mathematics curriculum. For developing each unit,
the objectives of the 7th grade mathematics curriculum and
CLT effects were considered. The literature on CLT was re-
viewed at the beginning of the development of the student’s
booklets. The CLT effects used in the Student’s Booklets were
as follows: 1) worked example effect; 2) completion example
effect; 3) split-attention effect; 4) modality effect; 5) redundancy
effect; and 6) guidance fading effect. Six Students’ Booklets
were developed together with Teachers’ Guidelines.
Table 1.
Subjects of the study.

Gender      Experimental Group   Control Group   Total
Male                19                 20           39
Female              21                 20           41
Four mathematics teachers checked the booklets in terms of
their content, appropriateness of the language used, the grade
level of students, and the CLT effects. Some modifications, such as rewording and reordering exercises according to CLT, were made based on their criticisms and suggestions. Later, an expert in the Department of Secondary School Science and Mathematics Education reviewed the Students’ Booklets and made some suggestions about the general layout of the booklets, their content, and the wording of some of the exercises.
“Teacher’s Guidelines” were developed for the experimental
group teacher in order to help her manage the instruction.
Teacher’s Guidelines were prepared by the researchers and
covered the titles “explanations, duration of lesson, prerequi-
sites of the lesson, objectives of lesson, classroom materials,
and implementation of the lessons”. In the “explanation” part,
which effects of the CLT were used and how they were imple-
mented were clarified for the teachers. In the “implementation
of the lesson part”, the implementation of the unit was ex-
plained from beginning to end by considering CLT effects and
principles.
Instruction in Experimental Group
All the activities in the experimental group were prepared by
using the CLT principles and effects. According to the CLT, the
first step of instruction is the determination of the students’
prior knowledge. The classroom was heterogeneous in terms of
achievement, increasing the importance of the prior knowledge
of students. This issue was related with the expertise reversal
effect of the CLT. At the beginning of each lesson, the teacher
asked some questions or made revisions about the topic in order
to refresh the students’ memory and activate their prior know-
ledge.
After the repetition of some prerequisite learning, for each
Algebra topic, the teacher distributed Students’ Booklets and
carried out the lesson by following the content of the booklets.
The worked examples included in booklets were the starting
point of each lesson. The teacher explained each topic on the
worked examples verbally (modality effect) and then assigned a
similar practice exercise to the students for satisfying schema
construction. The teacher gave the students enough time to
complete practice exercises and always observed the classroom
in order to help the students. For some Algebra topics such as
Equations, the worked examples were not directly followed by
the practice exercise; instead, completion examples were used.
By using the completion examples, a smooth transition from the worked examples to practice exercises was achieved. Similarly,
in the beginning of each lesson, the teacher explained worked
examples verbally; then the teacher gave students time to com-
plete the missing steps of the completion examples; and finally,
practice exercises were assigned. The use of worked examples,
completion examples, and practice exercises was related to
backward fading effect of the CLT, which allowed accommo-
dating a gradual learning process. By considering this effect,
the WM overload was diminished.
In all Algebra topics, for satisfying the split-attention effect
of the CLT, the teacher did not write anything on the board.
Further, Students’ Booklets were considered as content sum-
maries and prevented note-taking for satisfying split attention
effect of the CLT. Redundancy in training refers to providing more expressions of content than are needed for understanding. To address the redundancy effect of the CLT, no pictures, extra explanations, or classroom activities were included at the beginning of each Algebra topic. For this effect, information was presented to students through one mode; in other words, the teacher explained the content either visually or in audio format.
According to the CLT, after students gain experience related to a topic, worked examples and completion examples become detrimental rather than beneficial for learning. For this reason, after the students had gained some experience with the Algebra topics, full exercises or classroom activities/group work were assigned to the students. The same procedure was followed for all topics covered.
Data Collection Instruments
Algebra Achievement Test
Algebra Achievement Test (AAT) was developed by the re-
searchers for determining the 7th grade students’ Algebra achievement and consists of 20 open-ended questions related to the
Algebra topics. All questions in the test were developed by
considering the objectives and content of the 7th grade Mathe-
matics Curriculum related to the Algebra topics. The test covers questions on 1) finding the value of an expression; 2) simplifying
expressions; 3) writing the algebraic expression of a pattern; 4)
solving equations; 5) solving problems by using equations; 6)
constructing problems; 7) determining a point on Cartesian
Coordinate plane; and 8) sketching graphs.
The possible scores of the test range from 0 to 80. A rubric
was created by the researchers for scoring the test. For each question, a five-level score (0 - 4) was assigned. The highest score of 4 was awarded for responses that the researchers regarded as entirely correct and satisfactory, while the lowest score of 0 was reserved for no answer or a completely wrong answer.
Four mathematics teachers checked the test and rubric for
content validity evidence by comparing the items with the ob-
jectives. “Expert Opinion Form (EOF)” was developed for the
test according to the units, objectives, and class level, and its
clarity and general outlook. Moreover, the EOF was sent to the
two mathematics education faculty staff to check the appropri-
ateness, relevance, and consistency of the questions with the
nature of CLT. Some revisions were made to the wording of the questions, taking the experts’ views into account, to make them clearer and more suitable for the learning outcomes being measured.
The AAT was piloted with 229 8th grade students from
three public schools in the first semester of 2009-2010 aca-
demic year. The purpose of the piloting was to check the clarity
of the questions, to ensure the adequacy of the test duration,
and to check reliability (internal consistency of the test). After
the implementation, the researchers graded the test by taking
the rubric into consideration. Upon the completion of grading by one of the researchers, another mathematics teacher scored 80 randomly selected tests. The inter-rater reliability analysis was conducted by using the Statistical Package for Social Sciences (SPSS) 15.0 for Windows. The inter-rater reliability coef-
ficient by means of intra-class correlation (ICC) was computed
in order to establish the extent of consensus on the use of the
scoring rubric for the test. The ICC value of the test was 0.90,
which indicated high reliability and internal consistency of
scoring rubric as used by two raters.
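Outside SPSS, the same intra-class correlation can be computed with the open-source pingouin package; the sketch below assumes a long-format table with hypothetical column names and values:

```python
import pandas as pd
import pingouin as pg  # assumption: pingouin is installed (pip install pingouin)

# Hypothetical long-format data: each double-scored test appears once per rater,
# with the total rubric score (0-80) assigned by that rater.
scores = pd.DataFrame({
    "test_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "rater": ["A", "B"] * 4,              # researcher vs. second mathematics teacher
    "score": [42, 44, 15, 14, 60, 58, 33, 35],
})

# Intra-class correlation between the two raters on the rubric scores.
icc = pg.intraclass_corr(data=scores, targets="test_id", raters="rater", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
# An ICC near 0.90, as reported for the AAT rubric, indicates high inter-rater agreement.
```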
The Subjective Rating Scale of Cognitive Load
The one item Subjective Rating Scale (SRS) developed by
Paas and Van Merrienboer (1993) was used to measure stu-
dents’ cognitive load for both experimental and control group
students. It was implemented at the end of each Algebra unit.
The SRS was a 9-point symmetrical rating scale, ranging from
1 (very, very low mental effort) to 9 (very, very high mental
effort). The reliability coefficient of the scale was found to be 0.82 by Paas and Van Merriënboer (1993). The Turkish adaptation of the scale was developed by Kılıç and Karadeniz (2004). For the translation of the scale into Turkish and for its clarity, Kılıç and Karadeniz took expert opinion and prepared a form for piloting (Sezgin, 2009). For the internal consistency (reliability) of the scale, they conducted a study and found a Cronbach’s alpha reliability coefficient of 0.90, which indicates high reliability. Sezgin (2009) also conducted a study on the internal consistency of the SRS and found a Cronbach’s alpha reliability coefficient of 0.78, which indicates moderate reliability.
Performance is measured by a test taken at the end of the
lesson. Cognitive load is most commonly measured by learner
estimates of lesson difficulty. The difficulty (cognitive load) of a lesson is assessed by using a 1-to-7 or a 1-to-9 scale. In this study,
performance was measured by a test taken at the end of the
implementation (AAT) and cognitive load was measured by
learner estimates of lesson difficulty (SRS).
Results
The Descriptive Results of the Algebra Achievement Test (AAT) and Cognitive Load Scores (SRS)
The descriptive statistics related to the students’ AAT and
SRS for each of the experimental and control groups are given
in Table 2.
Descriptive statistics revealed that in terms of the AAT
scores, the students in the experimental group (M = 41.18, SD
= 12.86) had much higher scores than the students in the con-
trol group (M = 14.88, SD = 8.28). In terms of the SRS scores,
the experimental group students (M = 3.58, SD = 1.55) had
lower scores than the control group students (M = 5.05, SD =
1.39).
Table 2.
Mean and standard deviation of AAT and SRS scores.

                    AAT*               SRS**
Groups           M       SD          M      SD
Experimental   41.18    12.86       3.58    1.55
Control        14.88     8.28       5.05    1.39

Note: *Total score of the AAT is 80; **9-point scale.

The mean score of cognitive load was calculated by taking the mean of the cognitive load data collected with the SRS at the end of each of the six Algebra topics. Cognitive load scores ranging between 1 and 4 denote low cognitive load and those ranging between 5 and 9 denote high cognitive load conditions (Paas &
Van Merrienboer, 1993). According to the cognitive load
ranges, there were 7 students in high cognitive load condition in
the experimental group, while there were 23 students in high
cognitive load condition in the control group. In contrast, there
were 33 students in low cognitive load condition in the experi-
mental group, while 17 students were in low cognitive load
condition in the control group. Hence, the number of students
with a high cognitive load was greater in control group than in
the experimental group.
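A small sketch of this categorization step, assuming each student’s mean SRS score sits in a pandas DataFrame with hypothetical column names and values:

```python
import pandas as pd

# Hypothetical mean SRS scores (1-9 scale) for a handful of students per group.
data = pd.DataFrame({
    "group": ["experimental", "experimental", "control", "control"],
    "mean_srs": [3.2, 4.8, 5.5, 3.9],
})

# Mean scores from 1 to 4 denote a low cognitive load condition and scores
# from 5 to 9 denote a high cognitive load condition (Paas & Van Merriënboer, 1993).
data["load_condition"] = pd.cut(data["mean_srs"], bins=[0, 4, 9], labels=["low", "high"])

# Count of students per group and load condition, as reported in the text.
print(data.groupby(["group", "load_condition"], observed=False).size())
```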
Multivariate Analysis of the Variance (MANOVA):
Investigation of the Effects of an Instruction Designed
by CLT Principles on Students’ Algebra Achievement
and Cognitive Load
MANOVA was conducted to find the answer of the research
question “does the instruction designed by CLT have a significant effect on 7th grade students’ Algebra achievement and
cognitive load?”
Prior to conducting MANOVA, the assumptions underlying
this technique, namely the independence of observations, mul-
tivariate normality, homogeneity of variance-covariance, inter-
val/ratio scale on dependent variables, and outliers (Tabachnick
& Fidell, 2001) were checked in order to explore the appropri-
ateness of the data for running MANOVA.
The independence of observations assumption was met since
different groups did not affect each other when answering the
items in the tests used for this study.
Multivariate normality requires that the sampling distribu-
tions of the means of the dependent variables in each cell and
all combinations of them are normally distributed (Tabachnick
& Fidell, 2001). In order to check univariate normality assump-
tion, skewness-kurtosis values, Q-Q plots and Kolmogorov-
Smirnov and Shapiro-Wilk’s tests were examined (Field, 2009).
Skewness and kurtosis values were not far from 0 for each group, which implied a normal distribution. When the points on Q-Q plots for the cases fall along the diagonal running from lower left to upper right, with some minor deviations due to random processes, this implies normality of the distribution (Tabachnick & Fidell, 2001). The Q-Q plots of the variables indicated a nor-
mal distribution. Further, the results of the Kolmogorov-Smir-
nov and Shapiro-Wilk’s tests also indicated the normal distri-
bution. Mardia’s test was used to examine multivariate normal-
ity. The test revealed a non-significant result indicating normal
multivariate distribution.
The outliers are observations with a unique combination of
characteristics identifiable as distinctly different from the other
observations (Hair Jr., Anderson, Tatham, & Black, 1995). Be-
cause MANOVA is a multivariate analysis, multivariate out-
liers are of special importance (Stevens, 1996). In order to ex-
amine the data for multivariate outliers, the Mahalanobis distance (D²) was used. D² is a measure of the distance in multidimensional space of each observation from the mean center of all observations (Hair et al., 1995). The analysis revealed that the data had no cases with D² values greater than the critical value of 13.60 (2.77) for the alpha level set at 0.05. Therefore, there
were no multivariate outliers.
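A minimal sketch of this multivariate outlier screen with numpy and scipy; the two-column score matrix and the cutoff choice are illustrative assumptions rather than the study’s actual data:

```python
import numpy as np
from scipy.stats import chi2

# Hypothetical matrix of the two dependent variables (columns: AAT, SRS).
scores = np.array([[41.0, 3.0], [15.0, 5.0], [38.0, 4.0], [12.0, 6.0], [55.0, 2.0]])

# Mahalanobis D^2 of each case from the centroid of the sample.
diff = scores - scores.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(scores, rowvar=False))
d_squared = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)

# Compare against a chi-square critical value with df = number of DVs (here 2);
# the study reports a critical value of 13.60 at alpha = 0.05.
critical = chi2.ppf(0.999, df=2)
print(d_squared.round(2), d_squared > critical)
```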
The assumption of homogeneity of variance and covariance is that the variance-covariance matrices within each cell of
the design are sampled from the same population variance-
covariance matrix and can be reasonably pooled to create a
single estimate of error (Tabachnick & Fidell, 2001).
The equality of variance assumption was satisfied according to the result of Levene’s test of equality of error variances (Field,
2009). Levene’s test was found to be non-significant for both
Algebra achievement and cognitive load scores (Table 3).
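The same univariate check can be run with scipy; the score lists below are placeholders:

```python
from scipy import stats

# Hypothetical AAT scores for the two groups.
aat_experimental = [41, 35, 52, 44, 38, 47, 29, 50]
aat_control = [15, 12, 20, 10, 18, 22, 9, 16]

# Levene's test of equality of error variances; a non-significant p-value
# supports the homogeneity-of-variance assumption for this dependent variable.
stat, p = stats.levene(aat_experimental, aat_control)
print(f"Levene W = {stat:.3f}, p = {p:.3f}")
```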
The homogeneity of covariance matrices was checked by
using Box M test (Field, 2009). Results of the Box M Test
showed that homogeneity of covariance assumption was vio-
lated for the analysis, F(3, 1095120) = 2.70, p < 0.05. Considering the result of the Box M test and the equality of the sample cells, robustness could not be guaranteed (Tabachnick & Fidell, 2001). Hence, Pillai’s Trace was used instead of Wilks’
Lambda to evaluate multivariate significance.
The dependent variables (AAT and SRS scores) were meas-
ured on a continuous scale, so the interval/ratio scale on de-
pendent variables assumption was met.
The result of the MANOVA test, Pillai’s Trace, was significant, F(2, 77) = 72.687, p < 0.05, indicating that the population means on the Algebra achievement and cognitive load scores were different for the two groups. The multivariate eta squared of 0.65 indicated that 65 percent of the multivariate variance of the AAT and SRS scores was associated with the group factor.
Since a significant result was obtained on the multivariate test (Table 4), follow-up univariate ANOVAs were examined in order to understand the effect of the instruction on the AAT and SRS scores. The results of the univariate ANOVAs
are shown in Table 4. The univariate ANOVAs for AAT and
SRS scores were both significant, F(1, 78) = 118.26, p < 0.05 and
F(1, 78) = 19.86, p < 0.05, respectively. In terms of the vari-
ance explained, groups explained 60 percent of the variance in
AAT scores whereas the groups explained 20 percent of the
variance in SRS.
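A compact sketch of the multivariate test with statsmodels is shown below; the DataFrame and its column names (Group, AAT, SRS) are assumptions made for illustration:

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical student-level data: group membership, AAT total (0-80), mean SRS (1-9).
df = pd.DataFrame({
    "Group": ["Exp"] * 5 + ["Ctrl"] * 5,
    "AAT": [41, 35, 52, 44, 38, 15, 12, 20, 10, 18],
    "SRS": [3.2, 4.1, 2.5, 3.8, 3.0, 5.5, 4.9, 6.0, 5.1, 4.7],
})

# One-way MANOVA with AAT and SRS as the dependent variables and Group as the factor.
manova = MANOVA.from_formula("AAT + SRS ~ Group", data=df)
print(manova.mv_test())  # reports Pillai's trace, Wilks' lambda, etc., for Group
```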
The Results of the Efficiency of Instruction
The second research question of the study was “is there a
significant difference between the efficiency scores of students
who were exposed to instruction designed by CLT principles
and to instruction recommended by MONE?”
In order to quantify the effects of the instruction designed by CLT principles and of the instruction recommended by MONE on students’ Algebra achievement and cognitive load, the AAT and SRS scores of each subject in the experimental and control groups were standardized by converting them to z-scores, and the following efficiency formula was used (Clark et al., 2005; Paas et al., 2003):

E = (z_AAT − z_SRS) / √2

The data given in Table 5 show the mean AAT z-scores, SRS z-scores, and efficiency values for both the experimental and control groups, calculated by the above formula (Kablan & Erden, 2008; Paas et al., 2003; Sezgin, 2009).
To test the hypothesis that the efficiency scores of the experimental group students were significantly different from those of the control group students, an Independent Samples t-test was conducted. The efficiency value of each student was calculated by entering that student’s SRS z-score and AAT z-score into the above efficiency formula. The result of the Independent Samples t-test is presented in Table 6.
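The whole efficiency calculation and the follow-up t-test can be sketched in a few lines of numpy/scipy; the arrays are placeholders, and standardizing over the pooled sample is an assumption about how the z-scores were obtained:

```python
import numpy as np
from scipy import stats

# Hypothetical AAT (0-80) and mean SRS (1-9) scores; first half = experimental group.
aat = np.array([41, 35, 52, 44, 38, 15, 12, 20, 10, 18], dtype=float)
srs = np.array([3.2, 4.1, 2.5, 3.8, 3.0, 5.5, 4.9, 6.0, 5.1, 4.7])
group = np.array(["Exp"] * 5 + ["Ctrl"] * 5)

# Standardize both measures over the pooled sample.
z_aat = (aat - aat.mean()) / aat.std(ddof=1)
z_srs = (srs - srs.mean()) / srs.std(ddof=1)

# Per-student instructional efficiency: E = (z_performance - z_effort) / sqrt(2).
efficiency = (z_aat - z_srs) / np.sqrt(2)

# Independent-samples t-test on the efficiency scores, as reported in Table 6.
t_stat, p_value = stats.ttest_ind(efficiency[group == "Exp"], efficiency[group == "Ctrl"])
print(f"mean E (Exp) = {efficiency[group == 'Exp'].mean():.2f}, "
      f"mean E (Ctrl) = {efficiency[group == 'Ctrl'].mean():.2f}, "
      f"t = {t_stat:.3f}, p = {p_value:.4f}")
```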
The results of Levene’s test evaluate one of the assumptions of the t-test, namely whether the population variances for the two groups are equal.
Table 3.
Levene’s test of equality of error variances.

Variable       F      df1    df2
AchScores    2.663     1      78
LoadScores   1.530     1      78

Note: *p < 0.05.
Table 4.
Multivariate and univariate analysis of variance.

            MANOVA           ANOVA
Variable    F(2, 77)    DV1 F(1, 78)    DV2 F(1, 78)
Group       72.687*     118.262*        19.863*

Note: F ratios are Pillai’s Trace approximation, DV1 = Achievement Scores, DV2 = Cognitive Load Scores, *p < 0.05.
Table 5.
The mean of AAT z-scores and SRS z-scores.

Group          SRS z-score (x-axis)    AAT z-score (y-axis)    Efficiency Value
Experimental          –0.45                    0.77                  0.86
Control                0.45                   –0.77                 –0.86
Table 6.
Results of the independent samples t-test for efficiency values of groups (n = 80).

Group           M       SD       t        df
Control       –0.86    0.64
Experimental   0.86    0.84    10.339     78

Note: *p < 0.05.
Based on Levene’s test of equality of variances, it can be assumed that the homogeneity of variances assumption was not violated (p = 0.081 > 0.05) in the study.
Table 6 shows that the t-test results indicated a significant mean difference in efficiency values between the control group (M = –0.86, SD = 0.64) and the experimental group (M = 0.86, SD = 0.84), t(78) = 10.339, p < 0.05. This finding implies that there was a significant difference between the efficiency values of the students in the experimental and the control groups.
Discussions and Conclusion
The results indicated that the students in experimental group
performed relatively well in AAT (Algebra Achievement Test)
compared to the control group. The mean score of the experi-
mental group was 41.18 and the mean score of the control
group was 14.88 (out of 80). The results of the SRS (Subjective
Rating Scale) indicated that the students in the experimental
group had lower SRS scores than those in the control group.
The mean SRS scores of the experimental and the control
groups were 3.58 and 5.05 (out of 9), respectively. These re-
sults are consistent with studies which hold that an instruction is efficient if it maximizes learning while minimizing the amount of cognitive load required (Paas & Van Merriënboer, 1994; Paas, 1992; Tuovinen
& Sweller, 1999).
Although the mean AAT score of the experimental group
was much higher than the control group, the mean scores of
AAT for both groups were actually low. One of the reasons for
these low achievement scores may be the deficiencies of the
students’ prior knowledge. The results indicated that the means of students’ previous year mathematics scores for the experimental group (M = 2.55 out of 5) and the control group (M = 2.98
out of 5) were not very high. According to the NCTM (2000)
Principles and Standards, the importance of the prior knowl-
edge was stressed as “students learn mathematics by connecting
new ideas to prior knowledge... Teachers should reveal stu-
dents’ prior knowledge and design experiences and lessons that
respond to, and build on, prior knowledge” (p. 18). Tatar and
Dikici (2008) reported that one of the reasons for students’ difficulties in mathematics is deficiencies in their prior knowledge. Therefore, it can be concluded that students’ prior knowledge in both the experimental and control groups affected their achievement in the AAT.
Another reason would be the socio-economic background of
the students. As described in the Method section, the school in which the study was conducted was located in a poor socioeco-
nomic neighborhood. Research has consistently shown that low socioeconomic status has a negative influence on students’ achievement (Coleman, 1966; Engin-Demir, 2009; Heyneman
& Loxley, 1983; Savaş, Selma, & Adem, 2010; Tansel & Bir-
can, 2004). More specifically, TIMSS 98-99 data (Yayan,
2003) indicated that socioeconomic status is positively related
to mathematics achievement in Turkey. Ersoy and Erbaş (1998)
conducted a study to assess the Algebra achievement level of
the 7th grade students in Turkey. They found that Algebra
teaching was very problematic in poor socio-economic neighborhoods. In other words, students in poor socio-economic neighborhoods had lower achievement scores in Algebra topics compared to those in medium or high socio-economic neighborhoods.
Another reason would be the school characteristics. The physical conditions of the school in which the study was conducted were quite poor in terms of school facilities and class size. Pre-
vious research indicated that school characteristics in terms of
school facilities and class size had significant effect on the
academic achievement of students (Engin-Demir, 2009; Fuchs
& Woessmann, 2007).
As discussed previously, the mean AAT score of the experimental group was much higher than that of the control group. This result indicates the success of the CLT in a poor socioeconomic neighborhood. Therefore, it can be concluded that the instruction designed by the principles and effects of the CLT is successful for teaching Algebra topics in a poor socio-economic neighborhood. Further, the SRS scores of the students in the experimental group were much lower than those of the students in the control group, indicating the efficiency of the instruction designed by the principles and effects of the CLT in a poor socio-economic neighborhood.
The results of the MANOVA showed that there was a sig-
nificant effect. The effect size (0.65) indicates the practical significance of this result. The students’ scores on the AAT and SRS differed significantly according to the instruction, and 65 percent of the multivariate variance of the AAT and SRS scores was associated with the group factor. In other
words, 65 percent of the variation of the AAT and SRS scores
was explained by the instruction developed by CLT principles.
Similar results concerning the significant effect of the CLT
principles have been found in several earlier studies (Atkinson et
al., 2003; Brunstein et al., 2009; Chandler & Sweller, 1992;
Kalyuga, Chandler, & Sweller, 2001; Kalyuga & Sweller, 2004;
Mousavi et al., 1995; Paas, 1992; Paas & Van Merrienboer,
1994; Sweller & Cooper, 1985; Tarmizi & Sweller, 1988; Zhu
& Simon, 1987). The results of the MANOVA suggested that
the instruction developed by the CLT principles can be used in
Algebra courses to increase achievement and decrease the cog-
nitive load.
To answer the second research question, the mean of the AAT
z-scores, SRS z-scores, and the efficiency values for both the
experimental and the control groups were calculated and put in
the efficiency formula. The mean SRS z-score (–0.45), AAT z-score (0.77), and efficiency value (0.86) indicated that the efficiency value of the experimental group was in the second quadrant of the Cartesian Coordinate Plane, above the line E = 0. As discussed before, the sign of the efficiency score determines whether efficiency is low or high; therefore, the efficiency score of the experimental group indicated high efficiency. On the other hand, the mean SRS z-score (0.45), AAT z-score (–0.77), and efficiency value (–0.86) indicate that the efficiency value of the control group was in the fourth quadrant of the Cartesian Coordinate Plane, below the line E = 0. The efficiency score of the control group indicated low efficiency.
The results of the Independent samples t-test, t(78) = 10.339,
p < 0.05, indicated that there was a significant difference be-
tween the efficiency scores of students who were exposed to the
instruction designed by the CLT principles and those exposed
to the instruction recommended by MONE. This result is consistent with previous findings that instruction designed by the CLT is more efficient than instruction designed by other instructional techniques (Gerven et al., 2003; Kalyuga, Chandler, & Sweller, 2001; Van Gerven et al., 2002; Van Merriënboer et al., 2002). In Turkey, previous studies on CLT principles mainly addressed multimedia learning, but their efficiency calculations remain relevant for comparison. The results of the studies conducted by
Kılıç (2006), Kablan and Erden (2008), and Sezgin (2009) in-
dicated that the instructions prepared by the CLT principles had
significantly high efficiency.
The results of this study showed that the instruction designed by CLT principles was effective with respect to students’ achievement in Algebra topics and their cognitive load. Further, the efficiency value
of the instruction indicated high efficiency. Therefore, the prin-
ciples and CLT effects that were used in this study can be rec-
ommended to practitioners to design effective instructional
environments.
The first and most important implication of an instruction
designed by the CLT principles is that effective instructional
environments depend on the human cognitive architecture sys-
tem. Therefore, the characteristics and the principles of this
system should be known well by the designers for effective
instructional environments.
How the knowledge is presented to the learners and in which
activities they engage in depend on the characteristics of the
WM, because the WM first processes information before it is
stored in the LTM. The major characteristic of the WM is its
limitation, both in terms of duration and capacity. The aim of
an instruction should be to ensure that the learners’ WM is not
overloaded. Therefore, the limitations of the WM should be
well known and the instruction should be designed according to
these limitations.
To manage the WM load and to facilitate LTM, the ECL
(caused by poorly designed instructional procedures that inter-
fere with schema acquisition) should be eliminated. Studying worked examples has been identified by the CLT as an effective method of reducing the ECL. The learner can devote all the available WM capacity to studying a worked-out solution and constructing a schema in LTM for solving similar problems. The worked example-problem pair instruction used in this study helped to decrease the learners’ ECL. This type of lesson, which alternates worked examples with similar practice problems, can be used in different instructional contexts.
The guidance fading effect of the CLT can be used especially
for the novice learners who are most susceptible to cognitive
overload.
Instructional designers can use guidelines for instructional
methods that work best with low knowledge learners and with
high knowledge learners. Instructional designers should avoid
ECL when the learners are novice. As the learners develop
expertise, the instructional techniques should be adjusted ac-
cordingly.
The cognitive effort required to take notes reduces cognitive
capacity. Therefore, small content summaries or other supple-
mentary materials can be provided for the learners.
Teachers can be trained or educated about the basic principles of the CLT in order to make better use of the limited capacity of the learners’ WM.
REFERENCES
Atkinson, R., Renkl, A., & Merrill, M. (2003). Transitioning from
studying examples to solving problems: Effects of self-explanation
prompts and fading worked-out steps. Journal of Educational Psy-
chology, 95, 774-783. doi:10.1037/0022-0663.95.4.774
Brunstein, A., Betts, S., & Anderson, J. R. (2009). Practice enables
successful learning under minimal guidance. Journal of Educational
Psychology, 101, 790. doi:10.1037/a0016656
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the for-
mat of instruction. Cognition and Instruction, 8, 293-332.
doi:10.1207/s1532690xci0804_2
Clark, R., Nguyen, F., & Sweller, J. (2005). Efficiency in learning:
Evidence-based guidelines to manage cognitive load. Sydney: Pfeif-
fer.
Coleman, J. S. (1966). Equality of educational opportunity. Education
Next, 6, 40-43.
Cowan, N. (2001). The magical number 4 in short-term memory: A
reconsideration of mental storage capacity. Behavioral and Brain
Sciences, 24, 87-114. doi:10.1017/S0140525X01003922
Dede, Y., & Argün, Z. (2003). Cebir, öğrencilere niçin zor gelmektedir.
Hacettepe Üniversitesi Eğitim Fakültesi Dergisi, 24, 180-185.
Dede, Y., Yalın, H. İ., & Argün, Z., (2002). 8th grade students’ mis-
takes on learning and misconceptions in variable concept. 5th Inter-
national Science and Mathematics Education Congress, Ankara.
Engin-Demir, C. (2009). Factors influencing the academic achievement
of the Turkish urban poor. International Journal of Educational De-
velopment, 29, 17-29. doi:10.1016/j.ijedudev.2008.03.003
Erbaş, A. K., & Ersoy, Y., (2005). 9th grade students’ performances
and misconceptions in solving equations. Proceedings of 5th Na-
tional Science Education Congress, Ankara
Field, A. P. (2009). Discovering statistics using SPSS. London: SAGE
Publications.
Fuchs, T., & Woessmann, L. (2007). What accounts for international
differences in student performance? A re-examination using PISA
data. Empirical Economics, 32, 433-464.
doi:10.1007/s00181-006-0087-0
Gerven, P. W. M., Paas, F., Merriënboer, J. J. G., Hendriks, M., &
Schmidt, H. G. (2003). The efficiency of multimedia learning into
old age. British Journal of Educational Psychology, 73, 489-505.
doi:10.1348/000709903322591208
Green, S., Salkind, N., & Jones, T. (1996). Using SPSS for Windows;
analyzing and understanding data. Englewood Cliffs, NJ: Prentice
Hall.
Hair, J. F., Anderson, R. E., Tatham, R. L., & Black, W. C. (1995).
Multivariate data analysis. Englewood Cliffs, NJ: Prentice-Hall.
Janssen, J., Kirschner, F., Erkens, G., Kirschner, P. A., & Paas, F.
(2010). Making the black box of collaborative learning transparent:
Combining process-oriented and cognitive load approaches. Educa-
tional Psychology Review, 22, 139-154.
doi:10.1007/s10648-010-9131-x
Kablan, Z., & Erden, M. (2008). Instructional efficiency of integrated
and separated text with animated presentations in computer-based
science instruction. Computers & Education, 51, 660-668.
doi:10.1016/j.compedu.2007.07.002
Kalyuga, S., Chandler, P., & Sweller, J. (2000). Incorporating learner
experience into the design of multimedia instruction. Journal of
Educational Psychology, 92, 126-136.
doi:10.1037/0022-0663.92.1.126
Kalyuga, S., Chandler, P., & Sweller, J. (2001). Learner experience and
efficiency of instructional guidance. Educational Psychology, 21, 5-
23. doi:10.1080/01443410124681
Kalyuga, S., & Sweller, J. (2004). Measuring knowledge to optimize
cognitive load factors during instruction. Journal of Educational
Psychology, 96, 534-558. doi:10.1037/0022-0663.96.3.558
Kılıç, E. (2006). Effects of parallel instructional design and task diffi-
culty level on university students’ achievement and cognitive load in
multimedia learning environment. Unpublished Doctorate Disserta-
tion, Ankara: Ankara University Educational Sciences.
Kirschner, P. (2002). Cognitive load theory: Implications of cognitive
load theory on the design of learning. Learning and Instruction, 12,
1-10. doi:10.1016/S0959-4752(01)00014-7
Kirschner, P., Sweller, J., & Clark, R. (2006). Why minimal guidance
during instruction does not work: An analysis of the failure of con-
structivist, discovery, problem-based, experiential, and inquiry-based
teaching. Educational Psychologist, 41, 75-86.
doi:10.1207/s15326985ep4102_1
Miller, G. (1956). The magical number seven, plus or minus two: Some
limits on our capacity for processing information. Psychological Re-
view, 63, 81-97. doi:10.1037/h0043158
MONE (2005). İlköğretim matematik dersi öğretim kılavuzu 6-8.
sınıflar. Ankara: MONE.
NCTM (2000). Principles and standards for school mathematics. New
York: National Council of Teachers of Mathematics.
Paas, F. (1992). Training strategies for attaining transfer of prob-
lem-solving skill in statistics: A cognitive-load approach. Journal of
Educational Psychology, 84, 429-434.
doi:10.1037/0022-0663.84.4.429
Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory: A
special issue of educational psychologist. Hillsdale, NJ: Lawrence
Erlbaum.
Paas, F., van Gog, T., & Sweller, J. (2010). Cognitive load theory: New
conceptualizations, specifications, and integrated research perspec-
tives. Educational Psychology Review, 22, 115-121.
doi:10.1007/s10648-010-9133-8
Paas, F., & Van Merrienboer, J. (1993). The efficiency of instructional
conditions: An approach to combine mental effort and performance
measures. Human Factors: The Journal of the Human Factors and
Ergonomics Society, 35, 737-743.
Paas, F., & Van Merrienboer, J. (1994). Variability of worked examples
and transfer of geometrical problem-solving skills: A cognitive-load
approach. Journal of Educational Psychology, 86, 122-122.
doi:10.1037/0022-0663.86.1.122
Paas, F., & Van Merriënboer, J. (1994). Instructional control of cogni-
tive load in the training of complex cognitive tasks. Educational
Psychology Review, 6, 351-371. doi:10.1007/BF02213420
Paas, F., Renkl, A., & Sweller, J. (2004). Cognitive load theory: In-
structional implications of the interaction between information
structures and cognitive architecture. Instructional Science, 32, 1-8.
doi:10.1023/B:TRUC.0000021806.17516.d0
Paas, F., Tuovinen, J., Tabbers, H., & Van Gerven, P. (2003). Cognitive
load measurement as a means to advance cognitive load theory.
Educational Psychologist, 38, 63-71.
doi:10.1207/S15326985EP3801_8
Plass, J. L., Moreno, R., & Brünken, R. (2010). Cognitive load theory.
Cambridge: Cambridge University Press.
Pawley, D., Ayres, P., Cooper, M., & Sweller, J. (2005). Translating
words into equations: A cognitive load theory approach. Educational
Psychology, 25, 75-97. doi:10.1080/0144341042000294903
Renkl, A., & Atkinson, R. (2003). Structuring the transition from ex-
ample study to problem solving in cognitive skill acquisition: A cog-
nitive load perspective. Educational Psychologist, 38, 15-22.
doi:10.1207/S15326985EP3801_3
Sezgin, E. (2009). The effects of multimedia courseware designed
based on cognitive theory of multimedia learning on cognitive load,
performance levels and retention. Unpublished Dissertation Thesis.
Adana: Çukurova University.
Sweller, J. (1988). Cognitive load during problem solving: Effects on
learning. Cognitive Science, 12, 257-285.
doi:10.1207/s15516709cog1202_4
Sweller, J. (1994). Cognitive load theory, learning difficulty, and in-
structional design. Learning and Instruction, 4, 295-312.
doi:10.1016/0959-4752(94)90003-5
Sweller, J. (2010). Element interactivity and intrinsic, extraneous, and
germane cognitive load. Educational Psychology Review, 22, 123-
138. doi:10.1007/s10648-010-9128-5
Sweller, J., Ayres, P., & Kalyuga, S. (2011a). Cognitive load theory
(Vol. 1). New York: Springer. doi:10.1007/978-1-4419-8126-4
Sweller, J., Ayres, P., & Kalyuga, S. (2011b). Cognitive load theory in
perspective. Cognitive Load Theory, 237-242.
Sweller, J., & Chandler, P. (1991). Evidence for cognitive load theory.
Cognition and Instruction, 8, 351-362.
doi:10.1207/s1532690xci0804_5
Sweller, J., & Cooper, G. (1985). The use of worked examples as a
substitute for problem solving in learning algebra. Cognition and In-
struction, 2, 59-89. doi:10.1207/s1532690xci0201_3
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics
(4th ed.). Needham, MA: Allyn & Bacon.
Tabbers, H. K., Martens, R. L., & Merriënboer, J. J. G. (2004). Multi-
media instructions and cognitive load theory: Effects of modality and
cueing. British Journal of Educational Psychology, 74, 71-81.
doi:10.1348/000709904322848824
Tarmizi, R. A., & Sweller, J. (1988). Guidance during mathematical
problem solving. Journal of Educational Psychology, 80, 400-424.
doi:10.1037/0022-0663.80.4.424
Tatar, E., & Dikici, R. (2008). Matematik Eğitiminde Öğrenme Güç-
lükleri. Mustafa Kemal Üniversitesi Sosyal Bilimler Enstitüsü Der-
gisi, 5, 180-193.
Tobias, S., & Duffy, T. M. (2009). Constructivist instruction. Oxford:
Taylor & Francis.
Tuovinen, J., & Sweller, J. (1999). A comparison of cognitive load
associated with discovery learning and worked examples. Journal of
Educational Psychology, 91, 334-341.
doi:10.1037/0022-0663.91.2.334
Usiskin, Z. (1995). Why is algebra important to learn. American Edu-
cator, 19, 30-37.
Van Gerven, P., Paas, F., Van Merrinboer, J., & Schmidt, H. (2002).
Cognitive load theory and aging: Effects of worked examples on
training efficiency. Learning and Instruction, 12, 87-105.
doi:10.1016/S0959-4752(01)00017-2
Van Gog, T., Paas, F., & Van Merriënboer, J. (2004). Process-oriented
worked examples: Improving transfer performance through enhanced
understanding. Instructional Science, 32, 83-98.
doi:10.1023/B:TRUC.0000021810.70784.b0
Van Merrienboer, J. (1997). Training complex cognitive skills. Engle-
wood Cliffs, NJ: Educational Technology Publications.
Van Merriënboer, J., & Kirschner, P. (2001). Three worlds of instruc-
tional design: State of the art and future directions. Instructional
Science, 29, 429-441. doi:10.1023/A:1011904127543
Van Merrienboer, J., Kirschner, P., & Kester, L. (2003). Taking the
load off a learner’s mind: Instructional design for complex learning.
Educational Psychologist, 38, 5-13.
doi:10.1207/S15326985EP3801_2
Van Merriënboer, J., Schuurman, J., De Croock, M., & Paas, F. (2002).
Redirecting learners’ attention during training: Effects on cognitive
load, transfer test performance and training efficiency. Learning and
Instruction, 12, 11-37. doi:10.1016/S0959-4752(01)00020-2
Van Merriënboer, J., & Sweller, J. (2005). Cognitive load theory and
complex learning: Recent developments and future directions. Edu-
cational Psychology Review, 17, 147-177.
doi:10.1007/s10648-005-3951-0
Yayan, B. (2003). A cross-cultural comparison of mathematics achieve-
ment in the third international mathematics and science study-repeat
(TIMSS-R). Ankara: Middle East Technical University.
Zhu, X., & Simon, H. A. (1987). Learning mathematics from examples
and by doing. Cognition and Instruction, 4, 137-166.
doi:10.1207/s1532690xci0403_1