The development of a spelling assessment tool informed by Triple Word Form Theory.
Daffern, Tessa; Mackenzie, Noella Maree; Hemmings, Brian
Introduction
Spelling is one of the essential mechanics of the written English
language. It is the visible representation of 'word-level language
using written symbols in conventional sequences (orthography) that
represent speech sounds (phonology) and word parts that signal meaning
and grammar (morphology)' (Garcia, Abbott & Berninger, 2010, p.
63). Much of the English spelling system is also etymologically complex
as it derives from culturally and historically diverse linguistic
origins (Invernizzi & Hayes, 2004; Venezky, 2004).
Scholarly research in spelling has not yet reached consensus
regarding the trajectory of spelling knowledge in school-aged children.
This ambiguity has sustained the need for assessment systems that
reflect evolving perspectives on spelling development. Effective
spelling assessment systems should provide informative, reliable and
valid data and be culturally contextualised; however, spelling
assessment tools currently used in many classrooms do not necessarily
provide teachers with comprehensive, valid and reliable data. This
article reports on a study that developed and tested an innovative,
valid and reliable spelling assessment tool that could be used by middle
and upper primary school teachers, as well as educational researchers.
This tool, referred to as the Components of Spelling Test (CoST), was
designed to address a gap in pedagogy and in educational research
methodology. Of particular significance, the CoST is the first spelling
assessment tool of its kind to be developed and tested within an
Australian context.
A focus on spelling
It is well established that spelling is an important dimension of
writing and a fundamental part of being literate (Berninger, Abbott,
Nagy & Carlisle, 2010; Devonshire & Fluck, 2010). If students
have inadequate spelling skills, they may need to devote conscious
attention to the task of spelling rather than to other dimensions
required for composing quality texts (Hutcheon, Campbell & Stewart,
2012; Puranik & Al Otaiba, 2012). Research also suggests that as
primary school students progress through schooling, they may become less
willing to take risks with vocabulary choice when writing, particularly
if they are unsure of a word's spelling (Lowe & Bormann, 2012).
In addition, proficiency in spelling is known to support metalinguistic
skills, such as phonological awareness (Ehri, 1985) and morphological
awareness (Nagy, Berninger & Abbott, 2006). These skills, in turn,
positively influence writing competence (Martello, 2001), confidence and
general enjoyment and fluency in reading (Perfetti, 1997; Treiman,
1998). Indeed, Templeton and Morris (1999, p. 103) describe spelling
knowledge as the 'engine that drives efficient reading and
writing'.
The twenty-first century is characterised by rapid developments in
technology and an increasing reliance on the use of devices and software
designed to facilitate communication (Zedda-Sampson, 2013). As writers
navigate multimodal texts in an age where instant communication
intensifies, spelling competence becomes increasingly essential. Social
networking and digital text messaging have generated an additional
language known as 'texting' (Bushnell, Kemp & Martin,
2011; Zedda-Sampson, 2013), and this places demands on writers to
consciously control and manipulate their spelling in order to
communicate in a range of contexts. Applying and adapting spelling
systems to different social and cultural contexts requires autonomous
and critical spelling skills.
Learning to spell
Research has offered various perspectives on the nature of spelling
development; however, evidence has not been substantive enough to
provide consensus on whether spelling develops in progressive and
distinct stages, or whether spelling develops in more complex ways. The
analysis of the spelling errors students make has enabled stage theorists
(Bear & Templeton, 1998; Cataldo & Ellis, 1988; Ehri, 1985;
Frith, 1980; Gentry, 2012; Read, 2009) to produce a linguistic index (a
list of linguistic features, such as initial consonants and digraphs)
that has subsequently led to the categorisation of spelling development
into distinct and sequential stages. For example, according to Gentry
(2000, p. 324), students progress through five distinct stages of
spelling, namely, 'precommunicative', 'semiphonetic',
'phonetic', 'transitional' and 'correct'
(or 'conventional'). On the other hand, Bear, Invernizzi,
Templeton and Johnston (2012, p. 9) label the five stages as
'emergent', 'letter-name', 'within word',
'syllables and affixes' and 'derivational'.
Proponents of stage theories conceptualise spelling development using
different terminology, but these theories are broadly characterised by
similar qualitative, descriptive categories or stages. Developmental
spelling stages derive from 'Piagetian theory and the notion that
aspects of cognitive development proceed by way of qualitative
stage-like change' (Gentry, 2000, p. 319). According to stage
theories, 'spelling difficulties are viewed as an inability to move
on to the next stage' (Kohnen, Nickels & Castles, 2009, p.
116). Diagnostic assessments that are informed by stage theories, such
as the Words Their Way spelling inventories (Bear et al., 2012) and
Ganske's (2000) assessment model, are used to analyse
students' spelling errors and classify them according to a
particular stage of development. Although stage-oriented assessment
tools take into account phonological, orthographic and morphological
aspects within words, accuracy in phonological, orthographic and
morphological encoding is assumed to progress sequentially. Within a
stage-like assessment framework, instruction for the student then
focuses on assisting the student to progress to the next stage.
However, Ehri (2005) describes learning to read and spell sight
words in terms of sequential 'phases' rather than 'stages', with the
intention of avoiding the stringent assumptions associated with
developmental stages. While learning to read and spell may occur over
four 'successive' phases (Ehri, 2005, p. 176), namely 'prealphabetic',
'partial alphabetic', 'full alphabetic' and, finally, 'consolidated
alphabetic' (Ehri, 2005, 2013), Ehri acknowledges that progression
through the phases may overlap somewhat.
Discrepancies in views regarding how children learn to spell highlight
the need for further research.
Although stage theories offer a sequential framework that may be
useful for teachers planning and implementing teaching and learning
experiences, the need to reconsider spelling assessment is critical as
evidence of non-linear models of spelling development is mounting.
Indeed, converging evidence in support of non-linear models of spelling
development (see, for example, Garcia et al., 2010; Perfetti & Hart,
2002) has emerged in the last decade or so, suggesting that the
development of spelling may be far more complex than stage theorists
assumed in the 1970s and 80s. The view that young students are capable
of drawing on and coordinating phonological, orthographic and
morphological skills from the beginning of spelling development, and
that these students gain increasing explicit control over these skills
(Devonshire & Fluck, 2010; Garcia et al., 2010; Rittle-Johnson &
Siegler, 1999) is very different from the view held by stage theorists,
who assume that spelling progresses sequentially from phonology to
orthography to morphology (Bear et al., 2012; Gentry, 2012).
Accurate phonological representations reflect knowledge of how to
segment spoken words into the smallest units of sound within a word, as
well as knowledge of sound to letter correspondences in words (Bear et
al., 2012; Ganske, 2000). Processing and reproducing orthographic
features within words requires sensitivity to letter sequences, or
clusters of letters within a word, rather than to the visual features (or
shapes) of individual letters (Bahr, Silliman, Berninger & Dow,
2012). For example, when students have developed high orthographic
sensitivity, they have come to remember /ough/ as a whole unit (as in
the word brought) and can automatically encode the unit, as opposed to
laboriously encoding each individual grapheme as /o-u-g-h/. These
students are also highly aware that the letter ordering, or sequencing,
within the cluster /ough/ is plausible, and that /uohg/, for example, is
not. Developing this latter kind of orthographic sensitivity is referred
to by Treiman and Kessler (2006, p. 642) as 'statistical
learning'. Morphological knowledge reflects an individual's
capacity to reflect on, analyse and manipulate the morphemic elements in
words (Carlisle, McBride-Chang, Nagy & Nunes, 2010).
Theorists who advocate a non-linear perspective of spelling
development argue that phonology, orthography and morphology develop and
interact in parallel throughout the primary school years. Triple Word
Form Theory (TWFT) represents a non-linear stance and has been validated
in a series of brain imaging studies (Berninger et al., 2010; Richards,
Aylward, Berninger, et al., 2006) and behavioural studies (Berninger,
Raskind, Richards, Abbott & Stock, 2008; Garcia et al., 2010; Nagy
et al., 2006). Originating from studies conducted on samples of
individuals diagnosed with dyslexia and employing instructional methods
and brain imaging (Garcia et al., 2010; Richards, Aylward, Field, et
al., 2006; Richards, Berninger, Winn, et al., 2009), TWFT resonates
closely with Perfetti and Hart's (2002) Lexical Quality Hypothesis
(LQH). While Perfetti and Hart (2002) propose that spelling development
depends on the integration and interaction of three closely connected
constituents: orthography, phonology and semantics, their LQH is less
grounded in research than TWFT. For
instance, several studies in support of TWFT have involved comparisons
of individuals with and without dyslexia diagnoses, providing converging
evidence that unique and common brain regions are activated when
individuals are engaged in tasks distinguished as either phonological,
orthographic or morphological in nature (Berninger & Abbott, 2010;
Berninger et al., 2010; Garcia et al., 2010; Richards, Aylward,
Berninger, et al., 2006; Richards, Aylward, Field, et al., 2006). TWFT
assumes that phonological, orthographic, and morphological word forms
are involved in learning to spell from the early years of learning to
write, and that changes occur in the ways in which these linguistic
forms interact (Richards, Berninger & Fayol, 2009). Moreover,
according to TWFT, increasing efficiency and autonomy in the
coordination of the three linguistic word forms occurs over time,
largely, as a result of instructional priorities and approaches
(Berninger et al., 2010).
Assessment of spelling
Further research is still needed to describe and explain the
complex nature of spelling development; however, it seems sensible to
develop assessment systems that consider both non-linear and
stage-like views of spelling development. Popular standardised
assessment tools, such as the South Australian Spelling Test (Westwood,
2005), offer a range of words which implicitly encompass phonological,
orthographic and morphological word forms and increase in difficulty;
however, they do not provide teachers with the kind of diagnostic
information needed to support students' improvement in spelling. An
assessment system, developed in the US through qualitative analysis of
students' written compositions, known as the Phonological,
Orthographic, and Morphological Assessment of Spelling (POMAS) (Bahr et
al., 2012), conceptualises spelling in terms of the three word forms. However,
the POMAS has its limitations. Unlike the CoST, the POMAS does not
provide teachers with a statistically reliable, readily available,
efficient and user-friendly way to measure students' phonological,
orthographic and morphological spelling, based on words spelled to
dictation. The underlying purpose of the present study was to address
this issue. Indeed, it has been argued that existing measures of
spelling achievement are 'not sufficiently structured or
standardised to provide the reliable, sensitive data that teachers need
to plan instruction' (Al Otaiba & Hosp, 2010, p. 4). Moreover,
systematic and standardised spelling error analysis has the potential to
yield much richer data than simply examining student responses in terms
of words being correct or incorrect (Al Otaiba & Hosp, 2010; Bear et
al., 2012; Sharp, Sinatra & Reynolds, 2008).
The spelling of single words presented orally in the context of a
sentence, in contrast to the spelling of words in written composition,
provides a reliable medium from which to measure spelling achievement
(Kohnen et al., 2009). Indeed, when students compose whole written
texts, such as narratives, they need to manage the spelling task with
several other important cognitive processes, including planning,
creating cohesive syntactic structures, selecting vocabulary, reviewing
and monitoring (Mackenzie, Scull & Munsie, 2013). With this
assumption in mind, a dictation spelling test provides an appropriate
medium to measure specific knowledge of the components of spelling, as
it reduces other potential interferences that may otherwise disguise
a student's knowledge of the spelling system.
It has been argued that dictation spelling tests alone do not
assure comprehensive profiles of students' knowledge of the
spelling system (Hammond, 2004; Westwood, 2005). Analyses of
students' linguistic errors present in prescribed words may yield
an effective and integral source of feedback to teachers or educational
researchers; however, if accompanied by other methods of assessment,
feedback is undoubtedly enhanced. Additional methods of spelling
assessment should not be disregarded, and these may include reflective
conversations or self-reports by students (Critten, Pine & Messer,
2013; Devonshire & Fluck, 2010), and analyses of linguistic errors
present in students' contextualised written compositions (Bahr et
al., 2012; Devonshire & Fluck, 2010).
The study
The study described in this paper derives from research which took
place during the initial phases of a larger project examining the
spelling competence of 1 200 students aged between eight and 12 years.
The main purpose of the study was to develop and test the reliability
and validity of the CoST.
Method
The development and testing of this assessment tool is best
described in two main stages. The first stage involved designing a tool
that was informed by current literature on spelling development and
assessment. Words were selected by reviewing commonly used assessment
tools, such as the South Australian Spelling Test (Westwood, 2005) and
the spelling inventories from Words Their Way (Bear et al., 2012).
Consideration was also given to common spelling errors made by students
in the National Assessment Program - Literacy and Numeracy (NAPLAN)
Language Conventions Test (Willett & Gardiner, 2009), as well as
lists of frequently used words and 'demon words' (Roberts,
2001, p. 52). The intention was to compile a preliminary list of words
characterised by diverse lexical complexity. This compilation exercise
resulted in about 90 words being identified as having potential utility.
The chief researcher then determined and defined appropriate
linguistic features present in each word by considering the linguistic
index associated with stage theory. In particular, the commercially
available spelling inventories developed by stage theorists (see, for
example, Bear et al., 2012) were used as reliable and valid exemplars
from which to score and analyse spelling errors (Sterbinsky, 2007).
Specific linguistic features within each word were then aligned to the
three overarching components that underpin TWFT (that is, phonological,
orthographic and morphological). The tentative list of words and their
corresponding dictation sentences were further refined in terms of their
linguistic features, and condensed as part of an expert review process
before proceeding to the second research stage.
The second and final stage of the study involved (i) testing the
draft version of the CoST in several school contexts, and (ii) an
empirical analysis and refinement of the CoST. The following sections of
this paper consider this stage of the study.
Sample
The second stage of the study focussed on students in Year 3 and
Year 5 from four schools in the Australian Capital Territory (ACT),
Australia. Representation from all three school sectors (that is,
public, Catholic and independent schools) was included to ensure the
sample was broadly representative of ACT primary schools (Johanson &
Brooks, 2010). Approval to conduct the study was granted by the
university's Human Research Ethics Committee, the ACT Catholic
Education Office and the ACT Education and Training Directorate.
Informed written consent was also obtained from the participating school
principals, teachers, students and their parents prior to conducting the
school-based research. Using convenience sampling, 163 students from
Year 3 and 176 students from Year 5 were invited to take part in the
study. Of these, 94 Year 3 students and 97 Year 5 students agreed to
participate.
Test administration
The chief researcher administered the CoST to participating
students during regular class times and at their respective school sites
during Third Term, 2013. For each word in the CoST, students were
required to listen to the word first, which was then repeated in the
context of a sentence, and then restated one more time. Students'
spelling responses were collected and then individually marked by the
chief researcher. It should be noted that some words contained more
than one linguistic feature (item) to be assessed.
Reliability and validity analyses
To test the reliability of each of the three subscales or
components (namely, Phonological Component, Orthographic Component, and
Morphological Component), the raw scores were standardised and then
analysed using estimates of item difficulty and internal consistency.
Data from the Year 3 and Year 5 cohorts were analysed separately.
The item difficulty index was calculated on each subscale by
identifying the percentage of Year 3 students who accurately spelled
each feature (item) of a word. The same procedure was then repeated with
the Year 5 data. Higher percentages (approximately 80-90%) in the item
difficulty index identified items that were easier, whereas lower
percentages (approximately 20% or lower) identified items that were more
difficult. Overall, the scores were lower for Year 3 than for Year 5,
as presented in Table 1.
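To make the calculation concrete, the following is a minimal sketch in Python of the item difficulty index described above; the study itself used SPSS, and the response matrix shown here is hypothetical.

```python
import numpy as np

# Hypothetical binary responses: one row per student, one column per
# item; 1 = linguistic feature spelled correctly, 0 = incorrectly.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
])

# Item difficulty index: the percentage of students who spelled each
# feature (item) correctly. Values around 80-90% mark easier items;
# values around 20% or lower mark more difficult ones.
difficulty = responses.mean(axis=0) * 100
for i, p in enumerate(difficulty, start=1):
    print(f"Item {i}: {p:.0f}% of students correct")
```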
The distribution of scores, for each subscale, was then checked by
using the Descriptives command in SPSS, Version 20 (Buckingham &
Saunders, 2004). An inspection of the spread of scores helped to
determine if all items in the subscales were adequate measures. If the
results for an item are too similar, it is difficult to know whether the
item is adequate or whether the subscale actually lacks variability
(Macmillan & Schumacher, 2006).
Four items, and their related words, in the phonological
measure were considered for deletion due to some evidence of ceiling
effects. These included short vowels and final consonants in
one-syllable words such as tag, stick and gum. Although typically
developing students acquire knowledge of beginning and final consonant
letters and short vowel sounds from the early stages of learning to read
and write (Paris, 2005), a decision to keep these items was made to
offer participating students an opportunity to begin the spelling test
with a couple of one-syllable words containing common sound-letter
correspondences. Such an approach was adopted to help boost the
confidence of the respondents and make them feel more at ease in a test
situation.
The maximum score in the revised version of the phonological
measure was 29, recorded in Year 3, and 31 in Year 5. The respective
means were 20.9 and 23.0, as indicated in Table 1. There was also some
evidence of ceiling effects in the orthographic measure. A marked
improvement was evident from Year 3 to Year 5 in the morphological
subscale, with respective scores ranging from 2 to 28 and from 1 to 39,
indicating the greater difficulty of this area.
A test of scale reliability was also undertaken. This test examines
internal consistency by determining whether the items used to measure
each construct in the three subscales are well inter-correlated.
Internal consistency was tested by calculating Cronbach's alpha, which
ascertains how homogeneous the items of each subscale are (Colman &
Pulford, 2008). Cronbach's alpha is an appropriate statistic as the
items in the CoST are dichotomous (that is, each item is scored as
either correct or incorrect). According to Muijs
(2004), obtaining a strong case for reliability requires an alpha of
over 0.75. Cronbach's alpha on each subscale was calculated
separately by school year, using the Reliability Analysis command in
SPSS, Version 20.
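For readers unfamiliar with the statistic, the sketch below shows how Cronbach's alpha can be computed for a subscale of dichotomous items. It implements the standard formula rather than reproducing the authors' SPSS procedure, and the variable names are illustrative only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a students-by-items matrix of 0/1 scores.

    alpha = k / (k - 1) * (1 - sum(item variances) / var(total scores)),
    where k is the number of items; for dichotomous items this is
    equivalent to the KR-20 coefficient.
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of subscale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage, mirroring the separate analyses by cohort:
# alpha_year3_phonological = cronbach_alpha(year3_phonological_matrix)
```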
There was a total of 111 items in the original version of the CoST;
however, some items yielded either negative or low correlations or had
an inadequate spread of scores. Consequently, ten items, and their
related words, were deleted from the instrument. For example, one item
classified in the phonological measure as a consonant digraph was deemed
problematic (/sh/ in the word wish) as it did not yield a statistically
reliable result in Year 3. As a consequence, the word 'wish'
was removed from the revised list. In the orthographic measure, three
items were deleted to strengthen the overall reliability. These included
a common long vowel (/o/ in rope), a diphthong (/ow/ in shower), and an
unaccented final syllable (/er/ in shower). In the morphological
measure, six items were deleted because their removal improved the
overall alpha. These items included a derivational suffix (/able/ in
innumerable), a
homophone (torque), an assimilated prefix (/nn/ in innumerable), and
three root words (/arch/ in monarchy; /psych/ in psychology; and /equi/
in equilibrium).
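A common screening statistic for this purpose is the corrected item-total correlation, in which each item is correlated with the total of the remaining items in its subscale. The sketch below illustrates the idea; the exact SPSS output the authors consulted is not specified, so this is an assumed reconstruction.

```python
import numpy as np

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlate each item with the total of the remaining items.

    Negative or near-zero values flag items that weaken the subscale's
    internal consistency and are therefore candidates for deletion.
    """
    totals = items.sum(axis=1)
    correlations = []
    for j in range(items.shape[1]):
        rest = totals - items[:, j]  # subscale total excluding item j
        correlations.append(np.corrcoef(items[:, j], rest)[0, 1])
    return np.array(correlations)
```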
With a revised total of 101 items, the internal consistency results
of the finalised CoST were strong, as indicated in Table 2.
Content validity was also addressed by designing an instrument
based on current literature regarding the development of spelling and on
the nature of the Australian English spelling system. The structure of
the spelling test instrument closely aligns with the components of
spelling that define TWFT. It also utilises an error-analysis technique
that is characteristic of the spelling inventories provided by stage
theorists (see, for example, Bear et al., 2012). Consulting colleagues
during an expert review process further enhanced the validity of the
CoST (Muijs, 2004).
The revised structure of the CoST
The CoST was designed to efficiently assess a whole class of
students at one time. It is a dictation spelling test consisting of 70
words; however, what sets this test apart from other commonly used
dictation spelling tests designed specifically within an Australian
context, such as the South Australian Spelling Test (Westwood, 2005), is
in the way it is scored and analysed. Each word is analysed in terms of
specific linguistic errors that may be made by the student and is
informed by TWFT. In the CoST, each linguistic error is scored and
categorised into one of three subscales: Phonological Component;
Orthographic Component; and Morphological Component.
Across the three components (subscales), there are a total of 15
linguistic constructs (spelling features) and 101 individual items,
distributed across 70 words. Table 3 presents a summary of the final version of the
CoST's structure.
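To illustrate this structure, a single scored word might be represented as follows. The word shouted and its two features are taken from the examples given in Tables 5 and 6, but the CoST's full item map is not reproduced in this article, so the encoding and scores below are hypothetical.

```python
from collections import defaultdict

# Hypothetical encoding of one scored CoST word; construct labels are
# drawn from Tables 4-6, and the item-level scores are invented.
scored_word = {
    "word": "shouted",
    "items": [
        {"component": "Orthographic", "construct": "Ambiguous Vowel",
         "feature": "ou", "correct": 1},
        {"component": "Morphological", "construct": "Inflected Suffix",
         "feature": "-ed", "correct": 0},
    ],
}

# Subscale scores are then the sums of correct items per component.
subscales = defaultdict(int)
for item in scored_word["items"]:
    subscales[item["component"]] += item["correct"]
print(dict(subscales))  # {'Orthographic': 1, 'Morphological': 0}
```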
The Phonological Component
The Phonological Component in the CoST is designed to measure
knowledge of the phono-graphic (speech sound to alphabetic letter)
representation of initial and final consonants, regular short medial
vowels, common digraphs, and medial blends within polysyllabic words
(Richards, Aylward, Berninger, et al., 2006). The CoST is the first
assessment tool that considers phonological knowledge within the context
of complex polysyllabic words. Table 4 presents a summary of the four
constructs that comprise the Phonological Component, each of which
measures the spelling accuracy of specific phonological features within
words.
The Orthographic Component
The Orthographic Component in the CoST is designed to measure
knowledge of correct (ortho) letter sequences within written (graphy)
words. The English orthographic system relies on 26 letters of the
alphabet to represent 44 phonemes, or speech sounds (Bear et al., 2012).
Therefore, a single phoneme is sometimes spelled with different letter
sequences or combinations. The Orthographic Component in the CoST
considers the visual representation of conventional letter sequences as
a measure of orthographic knowledge. In the CoST, the Orthographic
Component consists of five constructs (see Table 5), each of which
measures the spelling accuracy in specific orthographic features within
words.
The Morphological Component
The Morphological Component in the CoST is designed to measure
knowledge of morphemic elements within words. A morpheme is the smallest
unit of meaning and includes prefixes, suffixes and roots (Ganske,
2000). A prefix is a unit of meaning that attaches to the beginning of a
base word or word root, while a suffix is a unit of meaning that
attaches to the end of a base word or root. Roots are examples of
morphemes that are etymologically significant, and most commonly
originate from the Latin and Greek languages (Bear et al., 2012; Ganske,
2000). There are two types of morphemes in the English language,
distinguished as either bound or free (Ganske, 2000). A bound morpheme
is the smallest unit of meaning that cannot be used as an isolated word.
Bound morphemes commonly include prefixes and suffixes such as pre-,
un-, re-, -ful, and -es. A free morpheme is commonly referred to as a
base word and is the smallest unit of meaning that can stand alone, such
as shout and march.
As morphemes are combined in various ways to express specific
meanings or to function in certain grammatical roles (Carlisle et al.,
2010), spelling is often determined by particular morphemic elements in
words. Some words might contain one free morpheme and one bound morpheme
(shout-ed and knotted), while many compound words contain two free
morphemes (sun-shine). Other words may contain several morpheme
combinations (in-cred-ible).
Six constructs are included in the Morphological Component to
measure the spelling accuracy of specific morphemic elements within
words, as summarised in Table 6.
Discussion
Reconciling the dichotomy of stage theories and nonlinear models of
spelling development is possible to some degree. By drawing on the
assumptions that underlie TWFT while simultaneously adopting the methods
of spelling error analysis developed by stage theorists, the CoST
appears to provide a reliable and valid tool to measure individual
differences in specific phonological, orthographic and morphological
skills associated with the Australian-English spelling system.
The CoST's innovation is in its capacity to interrogate
student knowledge of the spelling system without confining an
individual's spelling achievement to one particular stage of
development. The CoST's linguistic features account for the
likelihood that students in the middle and upper primary school years
may be encouraged to experiment with complex polysyllabic words and
homophones when they engage in the craft of writing, and that
phonological, orthographic and morphological processing may be
subsequently affected. Some students may demonstrate accurate
phonological processing when spelling one or two syllable words;
however, further research is needed to determine whether a break-down of
phonological processing occurs when the same students attempt to write
more complex, polysyllabic words. Spelling assessment systems need to
include valid and reliable measures of phonological, orthographic and
morphological processing, beyond the spelling of monosyllabic words, yet
it is intriguing that the use of polysyllabic words has been overlooked
in existing, commonly used spelling assessment tools, particularly with
respect to phonological processing. Additionally, the inclusion of
homophones as a construct in current spelling assessment tools has been
largely neglected, yet this linguistic feature seems to present ongoing
challenges to school-aged students (Kohnen et al., 2009). Given the
substantial number of homophones in the English language, their relative
absence as a measurable construct of morphological significance is
surprising.
The systematic analysis of linguistic errors, produced within
individual words, yields rich data; however, the ways in which
linguistic features are conceptualised may challenge existing
assumptions about the nature of spelling development. The CoST
categorises spelling errors as phonological, orthographic and
morphological, rather than as developmental stages, and this is starkly
different to a stage-like approach to spelling error analysis. Stage
theorists assume that spelling develops in a linear manner, yet emerging
research contests this view. A fundamental implication of the CoST is
that current perspectives of spelling development can now be further
contested. Curriculum developers and educators may need to consider that
an individual's understanding and application of the English
spelling system may not necessarily follow a sequential or linear path.
Primary school students experiment with increasingly complex vocabulary
as they learn to write and read. As they do, their phonological,
orthographic and morphological processing is likely to intensify in
efficiency and autonomy.
The CoST presents a new methodological foundation for further
investigation into students' spelling achievement; however, this
tool is not intended to be used in isolation or as a replacement for
other assessment tools, whether in the classroom or in scholarly
research contexts. Supplementing the CoST with additional measures is
critical to capturing the complex nature of spelling development. For
example, an assessment may also include tasks that require students to
proofread and edit spelling errors that are presented in a given text.
In addition, it will always be important to include analysis of
students' spelling within the context of written compositions, as
it is in this situation that we see the application of their developing
spelling competence. It also needs to be noted that, as with many
dichotomous measures, caution is needed when re-administering the CoST
to the same students, as it is possible that familiarity with the
dictated words may invalidate post-test results.
Conclusion
The CoST has the potential to offer rich insights into the spelling
skills of middle and upper primary school-aged students. Although the
findings of this study cannot be generalised to the broader primary
school student population, the CoST can be used for future examination
of spelling in other contexts. Indeed, this tool has since become
pivotal in a large-scale project involving 1 200 students, providing
evidence of criterion-related validity, among other findings, yet to be
published. Replicating the present study using a set of parallel items,
with other student populations, and with the inclusion of an item
discrimination analysis, will further validate and enhance the utility
of the CoST. There is also scope to apply the subscales and constructs
of the CoST as a framework for analysing spelling errors present in
students' written compositions. The cultural significance of this
assessment tool lies in its relevance to a twenty-first century
Australian education context. The CoST offers a means by which to
begin uncovering the complexities of students' knowledge of the
English spelling system in ways that may not have previously been
anticipated.
References
Al Otaiba, S. & Hosp, J. (2010). Spell it out: The need for
detailed spelling assessment to inform instruction. Assessment for
Effective Intervention, 36(1), 3-6.
Bahr, R., Silliman, E., Berninger, V. & Dow, M. (2012).
Linguistic pattern analysis of misspellings of typically developing
writers in grades 1-9. Journal of Speech, Language, and Hearing
Research, 55, 1587-1599.
Bear, D.R., Invernizzi, M., Templeton, S. & Johnston, F.
(2012). Words their way: Word study for phonics, vocabulary, and
spelling instruction (5th ed.). Upper Saddle River, NJ: Pearson
Education, Inc.
Bear, D.R. & Templeton, S. (1998). Explorations in
developmental spelling: Foundations for learning and teaching phonics,
spelling and vocabulary. The Reading Teacher, 52(3), 222-242.
Berninger, V. & Abbott, R. (2010). Listening comprehension,
oral expression, reading comprehension, and written expression: Related
yet unique language systems in Grades 1, 3, 5, and 7. Journal of
Educational Psychology, 102(3), 635-651.
Berninger, V., Abbott, R., Nagy, W. & Carlisle, J. (2010).
Growth in phonological, orthographic, and morphological awareness in
grades 1 to 6. Journal of Psycholinguistic Research, 39(2), 141-163.
doi: 10.1007/s10936-009-9130-6
Berninger, V., Raskind, W., Richards, T., Abbott, R. & Stock,
P. (2008). A multidisciplinary approach to understanding developmental
dyslexia within working-memory architecture: Genotypes, phenotypes,
brain, and instruction. Developmental Neuropsychology, 33(6), 707-744.
doi: 10.1080/87565640802418662
Buckingham, A. & Saunders, P. (2004). The survey methods
workbook. From design to analysis. Cambridge, UK: Polity Press.
Bushnell, C., Kemp, N. & Martin, F.H. (2011). Text-messaging
practices and links to general spelling skill: A study of Australian
children. Australian Journal of Educational & Developmental
Psychology, 11, 27-38.
Carlisle, J.F., McBride-Chang, C., Nagy, W. & Nunes, T. (2010).
Effects of instruction in morphological awareness on literacy
achievement: An integrative review. Reading Research Quarterly,
45(4), 464-487.
Cataldo, S. & Ellis, N. (1988). Interactions in the development
of spelling, reading and phonological skills. Journal of Research in
Reading, 11(2), 86-109.
Colman, A. & Pulford, B. (2008). A crash course in SPSS for
Windows (4th ed.). West Sussex: Wiley-Blackwell.
Critten, S., Pine, K. & Messer, D. (2013). Revealing
children's implicit spelling representations. British Journal of
Developmental Psychology, 31(2), 198-211. doi:10.1111/bjdp.12000
Daffern, T. & Mackenzie, N. (2015). Building strong writers:
Creating a balance between the authorial and secretarial elements of
writing. Literacy Learning: The Middle Years, 23(1), 23-32.
Devonshire, V. & Fluck, M. (2010). Spelling development:
Fine-tuning strategy-use and capitalising on the connections between
words. Learning and Instruction, 20, 361-371.
Ehri, L.C. (1985). Learning to read and spell. Paper presented at
the Annual Meeting of the American Educational Research Association,
Chicago.
Ehri, L.C. (2005). Learning to read words: Theory, findings, and
issues. Scientific Studies of Reading, 9(2), 167-188. doi:
10.1207/s1532799xssr0902_4
Ehri, L.C. (2013). Orthographic mapping in the acquisition of sight
word reading, spelling memory, and vocabulary learning. Scientific
Studies of Reading, 18(1), 5-21. doi:10.1080/10888438.2013.819356
Frith, U. (Ed.). (1980). Cognitive processes in spelling. London:
Academic Press.
Ganske, K. (2000). Word journeys: Assessment-guided phonics,
spelling, and vocabulary instruction. New York: The Guilford Press.
Garcia, N., Abbott, R. & Berninger, V. (2010). Predicting poor,
average, and superior spellers in grades 1 to 6 from phonological,
orthographic, and morphological, spelling, or reading composites.
Written Language & Literacy, 13(1), 61-98.
Gentry, J.R. (2000). A retrospective on invented spelling and a
look forward. The Reading Teacher, 54(3), 318-332.
Gentry, J.R. (2012). An analysis of developmental spelling in GNYS
AT WRK. In D. Wyse (Ed.), Literacy Teaching and Education (Vol. 3, pp.
347-357). London: SAGE Publications.
Hammond, L. (2004). Getting the right balance: Effective classroom
spelling instruction. Australian Journal of Learning Disabilities, 9(3),
11-18. doi:10.1080/19404150409546769
Hutcheon, G., Campbell, M. & Stewart, J. (2012). Spelling
instruction through etymology: A method of developing spelling lists for
older students. Australian Journal of Educational & Developmental
Psychology, 12, 60-70.
Invernizzi, M. & Hayes, L. (2004). Developmental spelling
research: A systematic imperative. Reading Research Quarterly, 39(2),
216-228.
Johanson, G.A. & Brooks, G.P. (2010). Initial scale development: Sample
size for pilot studies. Educational and Psychological Measurement,
70(3), 394-400. doi:10.1177/0013164409355692
Kohnen, S., Nickels, L. & Castles, A. (2009). Assessing
spelling skills and strategies: A critique of available resources.
Australian Journal of Learning Difficulties, 14(1), 113-150.
Lowe, K. & Bormann, F. (2012). U-Can Write: Working with
Struggling Writers. Literacy Learning: The Middle Years, 20(2), 22-28.
Mackenzie, N.M., Scull, J. & Munsie, L. (2013). Analysing
writing: The development of a tool for use in the early years of
schooling. Issues In Educational Research, 23(3), 375-393.
Macmillan, J. & Schumacher, S. (2006). Research in education:
Evidence-based inquiry (6th ed.). Sydney: Pearson, Allyn & Bacon.
Martello, J. (2001). Talk about writing: Metalinguistic awareness
in beginning writers. Australian Journal of Language and Literacy,
24(2), 101.
Muijs, D. (2004). Validity, reliability and generalisability. Doing
quantitative research in education with SPSS (pp. 64-76). London: Sage
Publications.
Nagy, W., Berninger, V. & Abbott, R. (2006). Contributions of
morphology beyond phonology to literacy outcomes of upper elementary and
middle-school students. Journal of Educational Psychology, 98(1),
134-147.
Paris, S.G. (2005). Reinterpreting the development of reading
skills. Reading Research Quarterly, 40(2), 184-202.
Perfetti, C. (1997). The psycholinguistics of spelling and reading.
In C. Perfetti, L. Rieben & M. Fayol (Eds.), Learning to Spell:
Research, Theory, and Practice Across Languages (pp. 21-38). Mahwah, NJ:
Lawrence Erlbaum Associates.
Perfetti, C. & Hart, L. (2002). The lexical quality hypothesis.
In L. Verhoeven, C. Elbro & P. Reitsma (Eds.), Precursors of
Functional Literacy (pp. 189-213). Philadelphia: John Benjamins
Publishing Company.
Puranik, C.S. & Al Otaiba, S. (2012). Examining the contribution
of handwriting and spelling to written expression in kindergarten
children. Reading and Writing, 25(7), 1523-1546.
Read, C. (2009). Learning to use alphabetic writing. In R. Beard,
D. Myhill, J. Riley & M. Nystrand (Eds.), The SAGE handbook of
writing development (pp. 260-270). Los Angeles: SAGE.
Richards, T., Aylward, E., Berninger, V., Field, K., Grimme, A.,
Richards, A. & Nagy, W. (2006). Individual fMRI activation in
orthographic mapping and morpheme mapping after orthographic or
morphological spelling treatment in child dyslexics. Journal of
Neurolinguistics, 19(1), 56-86.
Richards, T., Aylward, E., Field, K., Grimme, A., Raskind, W.,
Richards, A. & Berninger, V. (2006). Converging evidence for triple
word form theory in children with dyslexia. Developmental
Neuropsychology, 30(1), 547-589. doi:10.1207/s15326942dn3001_3
Richards, T., Berninger, V. & Fayol, M. (2009). fMRI activation
differences between 11-year-old good and poor spellers' access in
working memory to temporary and long-term orthographic representations.
Journal of Neurolinguistics, 22(4), 327-353.
Richards, T., Berninger, V., Winn, W., Swanson, H.L., Stock, P.,
Liang, O. & Abbott, R. (2009). Differences in fMRI activation
between children with and without spelling disability on 2-back/0-back
working memory contrast. Journal of Writing Research, 1(2), 93-123.
Rittle-Johnson, B. & Siegler, R.S. (1999). Learning to spell:
Variability, choice, and change in children's strategy use. Child
Development, 70(2), 332-348.
Roberts, J. (2001). Spelling recovery. Camberwell, Vic: ACER Press.
Sharp, A.C., Sinatra, G.M. & Reynolds, R.E. (2008). The
development of children's orthographic knowledge: A microgenetic
perspective. Reading Research Quarterly, 43(3), 206-226.
Sterbinsky, A. (2007). Words Their Way spelling inventories:
Reliability and validity analyses. Memphis, Tennessee: Center for
Research in Educational Policy.
Templeton, S. & Morris, D. (1999). Questions teachers ask about
spelling. Reading Research Quarterly, 34, 102-112.
Treiman, R. (1998). Why spelling? The benefits of incorporating
spelling into beginning reading instruction. In J. Metsala & L. Ehri
(Eds.), Word recognition in beginning literacy (pp. 289-313). Hillsdale,
NJ: Erlbaum.
Treiman, R. & Kessler, B. (2006). Spelling as statistical
learning: Using consonantal context to spell vowels. Journal of
Educational Psychology, 98(3), 642-652.
Venezky, R.L. (2004). In search of the perfect orthography. Written
Language & Literacy, 7(2), 139-163.
Westwood, P. (2005). Spelling: Approaches to teaching and
assessment. Camberwell, Vic: ACER Press.
Willett, L. & Gardiner, A. (2009). Testing spelling: Exploring
NAPLAN. Paper presented at the Australian Literacy Educators' Association
Conference.
Zedda-Sampson, L. (2013). Is U a word or do you spell it with a Z?
English spelling in Australian schools--Are we getting it write?
Literacy Learning: The Middle Years, 21(2), 41-51.
Tessa Daffern, Noella Maree Mackenzie, Brian Hemmings
Charles Sturt University
Tessa Daffern is a PhD candidate and Subject Coordinator in the
Master of Education at Charles Sturt University. She is an accredited
provider of professional learning with the Teacher Quality Institute, in
the Australian Capital Territory, and regularly works with teachers in
schools as a literacy consultant and presenter. Email:
tdaffern@csu.edu.au
Noella Mackenzie is a Senior Lecturer in literacy studies in the
School of Education at Charles Sturt University, Albury. She is a member
of the Research Institute for Professional Practice, Learning and
Education (RIPPLE). Noella's current research focuses on writing,
informed by her ongoing professional work with teachers in schools.
Email: nmackenzie@csu.edu.au
Brian Hemmings is currently the Sub-Dean (Graduate Studies) and
Deputy Director, Research Institute for Professional Practice, Learning
and Education (RIPPLE) at Charles Sturt University. He has published
widely and his most recent publications appear in Professional
Development in Education and the Australian Journal of Teacher
Education. Email: bhemmings@csu.edu.au
Table 1. Descriptive measures for CoST scores

                 Phonological   Orthographic   Morphological
                 Component      Component      Component
                 (31 Items)     (29 Items)     (41 Items)
Year 3 (n=94)
  Minimum         8              2              2
  Maximum        29             29             28
  Mean           20.91          18.12          12.96
  SD              3.94           7.17           6.63
Year 5 (n=97)
  Minimum         9              3              1
  Maximum        31             29             39
  Mean           23.01          23.36          20.39
  SD              4.39           6.07           9.12
Table 2. Internal consistency of the CoST

                 Phonological   Orthographic   Morphological
                 Component      Component      Component
Year 3 (n=94)        .78            .93            .89
Year 5 (n=97)        .84            .93            .94
Table 3. Summary of the Components of Spelling Test (CoST)

Phonological Component (31 items)
  Initial & Final Consonant: 5 items
  Short Vowel: 5 items
  Consonant Digraphs: 5 items
  Polysyllabic-word Medial Blends: 16 items

Orthographic Component (29 items)
  Common Long Vowels: 7 items
  Ambiguous Vowels: 7 items
  Complex Consonant Patterns: 5 items
  Syllable Juncture Consonants: 5 items
  Unaccented Final Syllables: 5 items

Morphological Component (41 items)
  Inflected Suffixes: 7 items
  Derivational Suffixes: 8 items
  Morpheme Juncture Schwa Vowels: 5 items
  Homophones: 7 items
  Greek and Latin Roots: 7 items
  Assimilated Prefixes: 7 items
Table 4. Phonological Component constructs

Initial and final consonant: An initial consonant is a single non-vowel letter positioned at the beginning of a word to directly represent the first phoneme (rob). A final consonant is a single non-vowel letter positioned at the end of a word to directly represent the last phoneme (rob).

Short vowel: A single vowel letter (a/e/i/o/u) positioned in the middle of a word to directly represent a lax phoneme, which is produced when the vocal cords are more relaxed (rob) (Bear et al., 2012).

Consonant digraph: Two letters that represent one phoneme (Bear et al., 2012; Ganske, 2000). Common consonant digraphs are included in this measure, positioned in the initial part of words (chew/thorn/why) and the final part of words (smooth/coach).

Polysyllabic word medial blend: Two or more letters in the middle of a polysyllabic word that are highly regular and phonetically represented; each letter represents a single regular sound. Examples include /agnosti/ (diagnostician), /ubstan/ (substantial) and /libri/ (equilibrium).

Note. Also see Daffern & Mackenzie (2015).
Table 5. Orthographic Component constructs

Common Long Vowel: Common letter patterns that represent long vowel sounds (stripe, moat).

Ambiguous Vowel: Ambiguous vowel patterns include diphthongs, in which the sound produced by one vowel glides into another (shouted and boil), and r-influenced vowels (marched).

Complex Consonant Cluster: Consonant letter sequences occurring in any part of a word. Letter clusters include several letters that represent several phonemes (stripe), as well as less common digraphs and trigraphs, which are letter combinations that represent one phoneme (smudged, scratches and knotted).

Syllable Juncture Consonants: Consonants are sometimes doubled at the juncture between two syllables in a word (bottle).

Unaccented Final Syllables: Letter sequences found in words where the last syllable is not stressed (bottle and tunnel).

Note. Also see Daffern & Mackenzie (2015).
Table 6. Morphological Component constructs

Inflected Suffix: Suffixes that change the verb tense (march-ed) or number (dog-s).

Derivational Suffix: Morphemes added to the end of base words that affect the meaning and/or part of speech (domin-ance).

Morpheme Juncture Schwa: The unstressed syllable in morphologically complex words contains a reduced vowel sound (opposition). These words contain a phonological shift at the morpheme juncture.

Homophone: The meaning of the homophone is understood in the context of a sentence and is represented using correct letter sequences (waist/waste).

Greek and Latin Root: A unit of meaning deriving from Greek or Latin origin to which prefixes and suffixes are added (psychology: psych means 'spirit' or 'soul' in Greek). Most roots are bound morphemes, or incomplete words.

Assimilated Prefix: Also known as absorbed prefixes. The sound and spelling of the prefix's final consonant is 'absorbed' into the initial consonant of the base word or root to which the prefix is affixed (an-notate).

Note. Also see Daffern & Mackenzie (2015).