The assessment of reading comprehension cognitive strategies: practices and perceptions of Western Australian teachers.
Oakley, Grace
Context for the study
The study being reported here set out to investigate how teachers
of 10- to 12-year-old children (Years 5 to 7 in WA) teach and assess
reading comprehension cognitive strategies (RCCS) and how confident they
feel about their teaching and assessment practices in this important
area. Although the study primarily aimed to find out about assessment
practices, it was also necessary to ask about teaching, since the two
are inextricably linked. In this article, the emphasis is on the
participating teachers' perceptions and self-reported practices in
the assessment of RCCS.
For the purposes of the present study, reading comprehension is
defined as 'the ability to derive meaning from text' (Rathvon,
2004, p. 156) and is deemed to be the ultimate aim of most reading
activity. Some 30 years ago, Durkin (1978) found that comprehension was
rarely taught explicitly, if at all, in classrooms. Since then,
researchers have put considerable effort into investigating how reading
comprehension processes and strategies might be taught, and research
evidence indicates that the teaching and learning of cognitive
strategies is highly beneficial in improving reading outcomes (e.g.
Pressley, 2000). In recent years, the importance of cognitive and
metacognitive comprehension processes and strategies has been
foregrounded (Block, Rodgers & Johnson, 2004), with the US National
Reading Panel (National Institute of Child Health and Human Development,
2000) acknowledging their centrality by conceptualising reading as an
active process that is directed by intentional thinking. According to
this view, readers need to make meaningful connections between their
thinking processes, the text, and their own prior knowledge. Thus, in
order to comprehend texts efficiently, readers need not only to be able
to identify words (graphophonic skills and sight word knowledge) and to
draw on knowledge of grammar and syntax, an appropriately developed
spoken vocabulary, knowledge of text structures and relevant background
knowledge, but also to choose, use and evaluate a range of RCCS, such
as inferring, creating mental
imagery, self-monitoring for meaning, clarifying, summarising and
predicting (Duke & Pearson, 2002; Irwin, 1991; Keene &
Zimmerman, 2007; Palincsar & Brown, 1984; Pressley, 1999, 2002;
Williams, 2002; Zimmerman & Keene, 1997). These strategies are
essentially ways of thinking, and their effective use involves
metacognition, or the ability to think about thinking. Much is now known
about teaching RCCS, although it has been suggested that research
findings have not necessarily been successfully transferred to classroom
teaching contexts (Allen & Hancock, 2008).
The importance of readers' ability to select and use RCCS has,
in recent years, been recognised through its inclusion in literacy
curricula, both in Australia and internationally. For example, in WA,
teachers must teach a 'processes and strategies' aspect in
reading (Curriculum Council, 1998), and the Draft National Curriculum
(English) (ACARA, 2010) includes comprehension strategies as essential
areas of achievement. Clearly, in order to effectively teach these
strategies, teachers need to be able to assess them, since it is
impossible to target teaching without good assessment data (assessment
for learning). However, because RCCS are not directly observable, they
can be difficult to assess. Often the teacher can only infer the
cognitive processes being used by children through the analysis of
comprehension products or representations, such as written work, role
plays, conversation, and so on. In other words, students' thinking
somehow needs to be made tangible or visible so that teachers can
attempt to assess it. Unfortunately, there is relatively little
direction available in the literature on how teachers might best assess
RCCS in real classroom contexts.
What does the literature say?
The literature indicates that reading comprehension assessment
needs to involve more than the comprehension-check questions typical of
standardised testing, since such assessments do not provide a teacher
with a full picture of a student's comprehension capabilities and
areas of need, and thus fail to inform instruction (Fiene &
McMahon, 2007; Wade, 1990; Oakley & Barratt-Pugh, 2007). On the
other hand, effective ongoing classroom-based assessment can result in a
series of assessments that show progress over a period of time and over
multiple contexts as opposed to a snapshot on a particular day. This
type of assessment is more likely to allow teachers insights into the
processes or strategies that children use to make and check meaning.
As noted above, finding effective ways to assess children's
RCCS can be a difficult and frustrating task for teachers (Israel,
Bauserman, & Block, 2005). Yet it is essential to attempt to do so in
order to inform instruction and provide useful feedback to children.
Sometimes children can misuse strategies, for example by making images
and inferences that are tangential to the text (Block & Pressley,
2007), and it is important for teachers to find out if this is happening
in order to remedy it.
In teaching and assessing RCCS, it is necessary to also teach and
assess metacognition, as this is what enables children to choose and
evaluate appropriate cognitive processes. Almasi (2004) explains that
children need to learn declarative, procedural and conditional knowledge
about comprehension strategies, and all three categories of knowledge
need to be assessed. Declarative knowledge is essentially 'what'
knowledge, or knowledge about the strategy and what it is; procedural
knowledge concerns how to carry out the strategy; and conditional
knowledge concerns when and why to use the strategy, which involves a
high degree of metacognition.
The literature on the subject, which is still somewhat limited,
describes several techniques for assessing RCCS, including questioning,
think alouds, interviews and surveys, and analysis of artefacts. These
approaches will now be briefly outlined and critiqued. Questioning has
always been a popular means of assessing comprehension, and this can be
used effectively to probe children's RCCS. For example, a set of
questions designed by Keene (2006, p. 55) to assess making connections
includes: 'When you read (or listened) to the text, did it remind
you of anything you know about and believe? What? Why did it remind you
of that? ... Did it remind you of any experiences or things that have
happened before?'
Questions designed for this purpose need to focus on the
child's thinking, and not on the text and its contents. Questions
can be presented either orally or in writing, although it should be
remembered that for young children or those who have difficulties in
literacy, questions that require written answers are not the optimal
assessment type since responses will be limited by the level of
children's writing ability.
Verbalised thinking or 'thinking aloud' can provide
highly valuable information about a child's cognitive processes and
may allow insight into the reasoning underpinning cognitive behaviours
(Wade, 1990). Think alouds require the child to say aloud what comes to
mind as they read a text (or have a text read aloud to them). In
order for children to be able to do this effectively, though, they need
to have witnessed teachers thinking aloud during comprehension
instruction on many occasions; the importance of teacher modelling
cannot be emphasised enough. To facilitate think alouds, such devices as
'Stop and Think Cards' (Annandale et al., 2004b) or stickers
offering 'thinking suggestions' can be inserted at pre-selected
places in the text. Students can also use sticky notes to record their
thoughts as they progress through texts (Fiene & McMahon, 2007).
However, think alouds do have limitations in that readers may not be
fully aware of what they are thinking. Furthermore, some children may
find it very difficult to articulate their thinking.
The literature indicates that the interview can be a highly
effective means of assessing the thinking children do when attempting to
construct meanings of texts. It is suggested that Reflective
Metacognitive Interviews may be useful, although there are obviously
limitations in that people (and children especially) cannot possibly be
fully aware of all of their thinking (e.g. Nisbett & Wilson, 1977),
as noted above. Reflective Metacognitive Interviews can be designed by
teachers and their aim is to encourage children to describe how they
read a text and why they did it that way. They assess declarative,
procedural and conditional knowledge (Almasi, 2004). As well as
interviews, there are several surveys and inventories available to
assist teachers in assessing children's cognitive and metacognitive
processes in reading. Schmidt (1990) designed the Meta-comprehension
Strategy Index, a self-report instrument that asks students about
strategies they might use before, during and after reading a narrative
text. This assessment is multiple-choice in format and assesses
students' meta-comprehension actions, broadly categorised into
predicting/verifying, previewing, purpose setting, self-questioning,
drawing on background knowledge, and summarising/fix-up strategies. It
should be noted that self-report instruments, although
valuable in that they encourage students to think about their thinking,
are limited in that children may not accurately report their thinking.
The Reading Strategy Awareness Inventory (Miholic, 1994) is another
multiple-choice instrument that probes children's self-monitoring
and awareness of reading strategies.
Another potentially useful inventory that aims to increase
children's metacognitive awareness and strategy use during reading
is the Metacognitive Awareness of Reading Strategies Inventory (MARSI)
(Mokhtari & Reichard, 2002). This inventory was designed for use
with students in Grades 6 to 12 and focuses on the transference of
responsibility for monitoring meaning from the teacher to the student,
increasing student awareness of strategy use and providing teachers with
a means of assessing, monitoring and documenting the type and number of
reading strategies used by students. Because it includes statements such
as: 'I stop from time to time to think about what I'm
reading', the MARSI is a form of self-assessment. Answers are on a
1-5 scale with 1 being 'I almost never do this'.
Self-assessments can be powerful means of assessment as they help
children think about areas that they might need to improve on (in other
words, they are educative), and can be motivational in that they
encourage ownership and a feeling of control. Their limitations are, of
course, the same as those of other self-report measures.
Several researchers have developed rubrics to guide assessment of
RCCS. Keene (2006, p. 63) has devised a series of rubrics to guide
teachers in observing and recording comprehension thinking strategies.
These rubrics comprise statements such as: '[Asks] no questions
and/or poses irrelevant questions; poses literal question(s) that relate
to the text; poses questions that clarify meaning'. Another
relevant rubric is the IRIS (Rogers et al., 2006), which can assist
teachers in assessing upper primary school children's RCCS,
specifically their ability to make connections, synthesise, monitor for
meaning, actively construct meaning through predicting, hypothesising
and questioning, and engage with the text through such activities as
visualising. This rubric employs a four-point scale, ranging from
'not yet meeting expectations' to 'exceeding
expectations'. A set of interview questions is also provided to
help teachers find out what children say about their strategy use. It
should be noted that rubrics presuppose that teachers have effective
methods for collecting relevant data about children's strategy use.
Miscue analysis, which has been in use in classrooms for many
years, can provide some useful insights into what occurs during
reading, confirm children's use of strategies, indicate
self-correcting behaviours, and show whether a child is monitoring their
comprehension (Israel et al., 2005). Running records (Clay, 2002) can
serve the same purpose and are thus of some value.
Finally, teachers may analyse written products or artefacts that
represent children's thinking. For example, in assessing
visualisation or making mental imagery it is possible to analyse
children's drawing or 3D models, or even dramatic role play or
tableaux. To evaluate summarisation, a written or oral summary might be
analysed. To assess questioning, the teacher might look at sticky notes
(in situ) that students have attached to texts, showing the questions
they asked. Discussion with children about their representations will
enhance such assessment.
To conclude the review of the literature, it appears that whilst
research on the teaching of RCCS is fairly robust, research on its
effective assessment has not been vigorous or particularly coherent. As
outlined above, some work has in recent years been conducted on
assessment, especially in the development of assessment instruments and
techniques; however, little research about how teachers use these
techniques in their classrooms, and the usefulness of these strategies,
has been conducted.
Method
The present study utilised survey research followed by
semi-structured interviews. Government, Independent and Catholic schools
in WA were sent letters and then emails inviting them to participate in
the research. A live link to an online survey was provided. Schools
requesting hard copy surveys were sent them. School principals who gave
consent for their school to participate in the survey then forwarded the
link to Year 5, 6 and 7 teachers. To encourage a good response, research
assistants preceded and followed up the email with telephone calls to
principals, where possible. Ninety-three teachers completed the survey
from schools in all sectors, from both metropolitan and country areas of
WA.
The survey enquired about teachers' procedures for teaching
and assessing RCCS. Although assessment was the primary focus, teaching
also needed to be investigated since teaching and assessment are
inextricably linked. Other information collected through the survey
included demographic details such as: the teacher's qualifications,
number of years of teaching, gender, and professional development
received in RCCS instruction and assessment. In addition, information
relating to teachers' confidence in teaching and assessing RCCS was
collected using Likert scales (see Appendix 1 for summary of survey
questions).
The second phase of the research involved eight semi-structured
interviews of approximately one hour each, to collect qualitative data
about the assessment of RCCS. The interviewees were asked to
describe their assessment practices and articulate their reasons for
using them. They were also asked for perceived limitations of the
techniques and what kinds of professional development they thought they
required. These qualitative data were analysed using Miles and
Huberman's (1984) content analysis techniques. The current article
mainly discusses the survey data, although some interview data are used
for clarification purposes.
Participants
The survey respondents were teachers in both Government and
Independent (including Catholic) schools. Seventy-two percent were
female and 28% were male. Respondents had been teachers for varying
lengths of time, ranging from less than five years (34%) to veteran
teachers with 31 or more years in the teaching profession (14%).
Findings and discussion
In terms of teaching RCCS, most of the respondents reported that they
were reasonably confident and that they taught a variety of strategies.
However, they were less likely to teach visualisation (making mental
images) as a strategy and even less likely to teach the metacognitive
skill of monitoring meaning. In terms of assessment, many teachers
reported a lack of confidence, as will be elaborated below.
The teaching of RCCS
Overall, 90% of the teachers indicated that they teach children how to
make inferences. With regard to making connections, 78.5% overall
claimed to teach this. Ninety-four percent reported that they teach
summarising, whilst 90% indicated that they teach children how to ask
questions of the text. However, only 53% reported that they
teach children how to visualise when reading texts.
Cross-tabulation of the data showed that there were some
interesting differences in practices reported by teachers with different
levels of experience. Forty-eight percent of teachers who graduated
within the last 10 years stated that they teach visualising, as opposed
to only 26% of the more experienced teachers. In terms of teaching
metacognitive strategies, 39.5% overall reported that they teach
self-monitoring for meaning. However, teachers who had graduated in the
last ten years were twice as likely as the more experienced teachers to
claim that they taught self-monitoring, with 50% of the former stating
that they teach self-monitoring for meaning, as opposed to only 25% of
the latter. Some teachers reported that they did not teach any RCCS
whatsoever, with one teacher who had between 11 and 15 years'
teaching experience writing: 'I don't know enough about it to
teach it.' These findings suggest that teachers who graduated more
than ten years ago may require professional learning opportunities in
this area if the range of RCCS taught across classrooms is to be
broadened.
[FIGURE 1 OMITTED]
Confidence in teaching RCCS
Only 26% of the respondents felt 'very' confident about
their ability to teach RCCS, with 11% overall stating that they were
'not very' confident. Notably, 33% of new graduates (with
two years or less in teaching) stated that they were 'not very'
confident. The remaining 63% of respondents were a lukewarm
'fairly' confident. It appears that confidence in teaching
RCCS may increase somewhat with experience.
Procedures used to assess RCCS
When asked to list and briefly describe the procedures they used to
assess comprehension processes and strategies, respondents reported
using a variety of techniques. Of course, participants may not have
mentioned all of the techniques they used but it is reasonable to
suppose that those mentioned would be the most salient in their view.
There was a heavy emphasis on the collection and analysis of
children's written work and concrete artefacts, though several
teachers also mentioned the analysis of drawings and role play as part
of their repertoire. The analysis of children's work may be seen as
a focus on the product of comprehension as opposed to cognitive
processes and strategies, and would yield limited diagnostic data to
help teachers target teaching for those children needing further
instruction in specific areas. In reporting the procedures used to
assess RCCS, none of the participants mentioned think alouds and only
one person mentioned the use of specific interviews and inventories.
Seventeen percent of the teachers surveyed indicated that they used
questioning to assess RCCS. However, almost all of these teachers
mentioned questioning the comprehension product rather than the process,
using the three-level questioning technique. One teacher wrote: 'I
assess the product rather than what children are doing when they are
reading.' In an interview, another said that she focuses very much
on assessing literal comprehension (products), mainly through oral and
written questioning.
Discussion was another assessment technique mentioned frequently in
the survey responses, although it is acknowledged that this survey did
not deeply interrogate the ways in which discussion is used to assess
RCCS. One teacher stated that during discussion she would 'ask the
kids to describe how they might use ... or have used ... a specific
strategy to understand the text at hand'. This would then be
discussed either with the teacher or peers. If used appropriately, as by
this participant, discussion can be a highly effective means of
assessing processes.
In the survey, there were three references to self-assessment,
three to peer assessment, one reference to rubrics and eleven to
observation of children during classroom activities. One teacher
mentioned observation during Reciprocal Teaching (RT) (Palincsar &
Brown, 1984), which would have been highly focussed on the
children's use of RCCS. The RT strategy focuses on teaching
children how to predict, question, clarify and summarise the texts that
they read, and this is done in small group contexts and involves a good
deal of discussion. Teachers can either listen in to children's
discussion about their thinking or can ask them to write brief notes
about their thinking.
Many teachers mentioned formal comprehension tests such as: the
Australian National Assessment Program--Literacy and Numeracy (NAPLAN),
which is composed of short texts in the form of a magazine and multiple
choice questions; the Tests of Reading Comprehension (TORCH), which is a
standardised cloze test (Mossenson, Hill, & Masters, 2003); and the
Progressive Achievement Tests in Reading (PAT-R) (ACER, 2008), which is
comprised of short texts and multiple choice questions. None of these
assessments would be highly useful in assessing processes, as the
comprehension product is the focus of these tests. Other teachers relied
heavily on commercial schemes with their associated worksheets. Many of
the assessment methods mentioned in the survey were probably not highly
effective in assessing the comprehension processes used by students.
This suggests that teachers may need guidance in choosing appropriate
strategies. A framework such as the one used by Magliano and colleagues
(Palincsar et al., 2007) might be useful in encouraging teachers to
choose assessments on the basis of text characteristics, the assessment
goal (e.g. process or products), the reader and the reading task.
Barratt-Pugh and Oakley (2007) have also suggested that teachers use
clear criteria in selecting assessment procedures. In order to use such
frameworks appropriately, teachers need a very deep understanding of how
children learn how to read, specifically how they learn RCCS. It is not
sufficient in teaching and assessing RCCS to simply know a set of
strategies and procedures; research shows that teachers need to
understand the principles underlying the practices (Palincsar et al.,
2007).
Confidence in assessing RCCS
As shown in Figure 2, the majority of the teachers surveyed
reported that they felt 'fairly confident' about their ability
to assess comprehension, with teachers who had the most experience
generally tending to feel the most confident, along with those who had
participated in what they saw as relevant professional development.
[FIGURE 2 OMITTED]
However, 6% of new graduates (less than two years' teaching
experience) indicated that they were 'not at all' confident,
and another 44% stated they were 'not very' confident. It is a
great concern that half of new teachers did not feel confident in this
area, although it is not possible to say whether their degree of
confidence was related to their level of competence. Twenty-three
percent of teachers with three to five years' experience and 27% with
six to ten years' experience were 'not very' confident. Thus,
approximately a quarter of more experienced teachers were also fairly
low in confidence, which is a further concern. It would be advantageous
to help all teachers feel 'very' confident in this area of
their work since feeling less than confident can be stressful for
teachers and may, indeed, be indicative that their practices are not
optimal.
In-service professional development in assessing RCCS
Fifty-three percent of the teachers indicated that they had
received some professional development (PD), or professional learning,
in assessing RCCS and, as might be expected, those who had received some
in-service training tended to feel more confident about their ability to
assess children's reading in this area (Figure 3).
[FIGURE 3 OMITTED]
Most respondents named First Steps (e.g. Annandale et al., 2004a,
2004b) as the professional development they had received, with
58% of government teachers mentioning this. First Steps is a series of
resources that views literacy learning as a developmental process which
occurs within a socio-cultural context. In terms of assessment of
cognitive strategies, the First Steps Reading Map of Development
(Annandale et al., 2004a) outlines a variety of useful strategies, such
as encouraging children to discuss and/or write down their
self-assessments and reflections regarding their use of strategies, and
the use of think alouds as a means of encouraging children to articulate
their thoughts before, during and after reading. A few respondents also
mentioned receiving professional development in 'Making Consistent
Judgements', which is a WA Department of Education package that
encourages teachers to use exemplars and guidelines to help them make
consistent judgements. The focus here is on products, not processes.
Although teachers who had received in-service professional
development tended to be more confident, some found the offerings to be
of little value: 'I found that most of the information was just
rehashing old information.' It is also interesting that teachers
did not always appear to apply the content of PD, such as the Stop and
Think Cards and think alouds described in First Steps.
Concluding comments
The majority of teachers surveyed report that they are attempting
to teach RCCS, although fewer teach visualising or making mental imagery
and less than 40% claim to teach the metacognitive skill of
self-monitoring for meaning. Although teachers say that they teach RCCS,
they are not always confident in this area.
In terms of the assessment of RCCS, this study indicates that many
teachers lack confidence and feel inadequately prepared. Confidence,
perhaps not surprisingly, appears to be linked to years of experience
and to professional development received. It is not clear, however,
whether some of the confidence felt by more experienced teachers may be
misplaced since some who felt confident had learnt informally, through
experience, and not through formal teacher education or professional
development. Also, many of the more experienced teachers taught a
narrower range of cognitive strategies, which might simplify assessment
requirements. This is an area that requires further investigation.
Qualitative responses indicate that there was a heavy reliance on
the analysis of comprehension products in order to infer effective
processes. Teachers reported that when they used questioning, they often
used it to probe comprehension products rather than the RCCS. According
to the literature, this is not likely to be the most effective means of
assessing this important area of learning. The present study suggests
that there may be a need for more, or qualitatively different,
professional development to assist teachers in WA to assess RCCS
effectively. Since assessment of these cognitive strategies seems to be
a relatively under-researched and under-discussed area, it may well be
the case that the assessment of RCCS needs to be improved outside WA as
well.
Before concluding, it is necessary to discuss the potential
limitations of the study. One limitation stems from the fact that only
93 surveys were collected and that participating schools were
self-selected in that principals acted as 'gatekeepers' in
deciding whether surveys would be presented to teachers or not. Although
teachers of all levels of experience were represented, it is likely that
'early career' teachers (with less than five years' experience)
were over-represented in the sample. On average, Australian primary
school teachers have 17 years' teaching experience, with 17% being
'early career' teachers with less than five years' experience. In
this study, 34% had less than five years' experience. It is
likely that the number of early career teachers in WA is higher than the
national average of 17% because of the unique economic circumstances of
this state and the high mobility of the workforce, but accurate
statistics were not available at the time of writing.
Another limitation is that the research relied on self-report data.
For a number of reasons, there may be cases where self-reports are not
fully accurate. Also, a few of the open-ended questions yielded somewhat
brief responses that were difficult to interpret. For example, many
respondents wrote that they used 'observation' as a means of
assessing children's comprehension strategies but sometimes did not
indicate what it was they observed. This could be an indication that
some teachers do not differentiate between data and the data collection
method, which is a fairly common assessment error. The limitation just
mentioned was to some extent ameliorated by the use of follow-up
interviews for eight of the respondents.
In conclusion, it can be argued that this study does provide some
legitimate insight into what teachers in WA are thinking, feeling and
doing in the teaching and assessment of RCCS. It also appears to alert
us to the possibility that many teachers would benefit from additional
or different professional learning and support. In addition, the study
reveals a need for further investigation into several aspects of the
assessment of RCCS, perhaps most urgently research on 'what
works' for teachers who show exemplary teaching and assessment
practices in this area.
Appendix 1
SUMMARY OF SURVEY QUESTIONS
About you
1. How many years have you been a Primary School Teacher?
0-2 [] 3-5 [] 6-10 [] 11-15 [] 16-20 [] 20-25 [] 25-30 [] 30+ []
2. Which year level do you currently teach? ___ (Yrs 5-7)
3. How long have you taught this year level? ___ years
4. What qualification do you hold?
Bachelor of Education [] Graduate Diploma [] Diploma in Teaching []
Other []
Post-Graduate [] Name of post-graduate qualification: ___
5. Gender: Male [] Female []
About your teaching of reading comprehension
6. Which comprehension processes do you teach the children in your
class? (Tick boxes)
Visualising/Mental Images [] Summarising [] Other []
Inferring [] Questioning []
Making connections [] Predicting []
Self monitoring [] Clarifying []
7. For the comprehension processes that you teach, please list
and/or briefly describe the teaching and learning strategies that you
employ.
Visualising/Making mental images:
*Expanding text boxes were also provided for: Predicting,
inferring, making connections, self monitoring, summarising,
questioning, clarifying, other.
8. Please briefly describe the procedures that you use to assess
the comprehension processes that you teach. Visualising/Making mental
images:
*Expanding text boxes were also provided for: Predicting,
inferring, making connections, self monitoring, summarising,
questioning, clarifying, other.
9. How well do you think your initial teacher training prepared you
for teaching comprehension processes?
Very well [] Well [] Adequately [] Poorly [] Not at all []
10. How well do you think your initial teacher training prepared
you for assessing comprehension processes?
Very well [] Well [] Adequately [] Poorly [] Not at all []
11. What professional development in this area have you had, if
any?
12. How well has in-service professional development prepared you
for teaching comprehension processes?
Very well [] Well [] Adequately [] Poorly [] Not at all []
13. How well has in-service professional development prepared you
for assessing comprehension processes?
Very well [] Well [] Adequately [] Poorly [] Not at all []
14. How confident do you feel about your ability to teach
comprehension processes?
Very confident [] Fairly Confident [] Not very confident [] Not at all
confident []
15. How confident do you feel about your ability to assess
comprehension processes?
Very confident [] Fairly Confident [] Not very confident [] Not at
all confident []
16. Please add any other comments that you may have about the
teaching and assessment of comprehension processes.
References
ACARA. (2010). Australian Curriculum: Draft consultation version
1.1.0 (English). Retrieved from
http://www.australiancurriculum.edu.au/Documents/K10/
English%20curriculum.pdf
Allen, K.D., & Hancock, T.E. (2008). Reading comprehension
improvement with individualized cognitive profiles and metacognition.
Literacy Research and Instruction, 47, 124-139.
Almasi, J.F. (2004). Teaching strategic processes in reading. New
York: The Guilford Press.
Annandale, K., Bindon, R., Handley, K., Johnston, A., Lockett, L.,
& Lynch, P. (2004a). Reading map of development: Addressing current
literacy challenges (2nd ed.). Port Melbourne: Reed International.
Annandale, K., Bindon, R., Handley, K., Johnston, A., Lockett, L.,
& Lynch, P. (2004b). Reading resource book: Addressing current
literacy challenges (2nd ed.). Port Melbourne: Reed International.
Australian Council for Educational Research. (2008). Progressive
achievement tests in reading--revised (PAT-R). ACER Press.
Barratt-Pugh, C., & Oakley, G. (2007). The identification of
assessment resources to support children learning to read in the early
years of school: Report. Perth: Government of Western Australia.
Block, C.C., & Pressley, M. (2007). Best practices in teaching
comprehension. In L.B. Gambrell, L. Morrow & M. Pressley (Eds.),
Best practices in literacy instruction (pp. 220-242). New York: The
Guilford Press.
Block, C.C., Rodgers, L.I., & Johnson, R.B. (2004).
Comprehension process instruction: Creating reading success in grades
K-3. New York: The Guilford Press.
Clay, M.M. (2002). An observation survey of early literacy
achievement (2nd ed.). Auckland: Heinemann.
Curriculum Council. (1998). Curriculum framework for kindergarten
to year 12 education in Western Australia. Osborne Park: Curriculum
Council.
Duke, N.K., & Pearson, P.D. (2002). Effective practices for
developing reading comprehension. In A.E. Farstrup & S.J. Samuels
(Eds.), What research has to say about reading instruction (3rd ed., pp.
205-242). Newark, Delaware: International Reading Association.
Durkin, D. (1978). What classroom observations reveal about reading
comprehension instruction. Reading Research Quarterly, 14, 481-538.
Fiene, J., & McMahon, S. (2007). Assessing comprehension: A
classroom-based process. The Reading Teacher, 60(5), 406-417.
Irwin, J.W. (1991). Teaching reading comprehension processes (2nd
ed.). Englewood Cliffs, NJ: Prentice Hall.
Israel, S.E., Bauserman, K.I., & Block, C.C. (2005).
Metacognitive assessment strategies. Thinking Classroom, 6(2), 21-28.
Keene, E. (2006). Assessing comprehension thinking strategies.
Huntington Beach, California: Shell.
Keene, E., & Zimmerman, S. (2007). Mosaic of thought: The power
of comprehension strategy instruction (2nd ed.). Portsmouth, NH:
Heinemann.
Miholic, V. (1994). An inventory to pique students'
metacognitive awareness of reading strategies. Journal of Reading, 38,
84-86.
Miles, M.B., & Huberman, A.M. (1984). Qualitative data
analysis: A sourcebook of new methods. Beverly Hills, CA: Sage.
Mokhtari, K., & Reichard, C.A. (2002). Assessing students'
metacognitive awareness of reading strategies. Journal of Educational
Psychology, 94, 249-259.
Mossenson, L., Hill, P., & Masters, G. (2003). TORCH: tests of
reading comprehension. Melbourne: Australian Council for Educational
Research.
National Institute of Child Health and Human Development. (2000).
Report of the National Reading Panel. Teaching children to read: An
evidence-based assessment of the scientific research literature on
reading and its implications for reading instruction (No. 00-4769).
Washington, DC: Government Printing Office.
Nisbett, R.E., & Wilson, T.D. (1977). Telling more than we can
know: Verbal reports on mental processes. Psychological Review, 84,
231-259.
Oakley, G., & Barratt-Pugh, C. (2007). The identification of
assessment resources to support children learning to read in the early
years of school: Literature review. Perth: Edith Cowan University.
Palincsar, A.S., & Brown, A.L. (1984). Reciprocal teaching of
comprehension-fostering and comprehension-monitoring activities.
Cognition and Instruction, 1, 117-175.
Palincsar, A.S., Spiro, R.J., Kucan, L., Magnusson, S.J., Collins,
B., Hapgood, S., et al. (2007). Designing a hypermedia environment to
support comprehension instruction.
In D.S. McNamara (Ed.), Reading comprehension strategies: Theories,
interventions and technologies. New York: Lawrence Erlbaum Associates.
Pressley, M. (1999). Self-regulated comprehension processing and
its development through instruction. In L.B. Gambrell, L.M. Morrow, S.B.
Neuman & M. Pressley (Eds.), Best practices in literacy instruction.
NY: The Guilford Press.
Pressley, M. (2000). What should comprehension instruction be the
instruction of? In M.L. Kamil, P.B. Mosenthal, P.D. Pearson & R.
Barr (Eds.), Handbook of Reading Research (Vol. III, pp.
545-561). Mahwah, NJ: Lawrence Erlbaum.
Pressley, M. (2002). Metacognition and self-regulated
comprehension. In A.E. Farstrup & S.J. Samuels (Eds.), What research
has to say about reading instruction (3rd ed., pp. 291-309). Newark,
Delaware: International Reading Association.
Rathvon, N. (2004). Early reading assessment. New York: The
Guilford Press.
Rogers, T., Winters, K.L., Bryan, G., Price, J., McCormick, F.,
House, L., et al. (2006). Developing the IRIS: Toward situated and valid
assessment measures in collaborative professional development and school
reform in literacy. The Reading Teacher, 59(6), 544-553.
Schmidt, M.C. (1990). A questionnaire to measure children's
awareness of strategic reading processes. The Reading Teacher, 43,
454-461.
Williams, J.P. (2002). Reading comprehension strategies and teacher
preparation. In A.E. Farstrup & S.J. Samuels (Eds.), What research
has to say about reading instruction (3rd ed., pp. 243-260). Newark,
Delaware: International Reading Association.
Zimmerman, S., & Keene, E.O. (1997). Mosaic of thought:
Teaching comprehension in a reader's workshop. Heinemann.
Grace Oakley
Graduate School of Education, University of Western Australia