Relating students' spoken language and reading comprehension.
Daly, Ann
Background to research study
This research was part of a doctoral research study (Daly, 2011) associated with an Australian Research Council (ARC) linkage project between the University of New England and the then NSW Department of Education and Training (now NSW Department of Education and Communities). The main ARC research project was concerned with developing a model of image-language relations.
Previous articles (Daly & Unsworth, 2011; Unsworth & Chan, 2008) have discussed the research findings from the main ARC project in relation to the construction of intermodal meaning in texts, the complexity of language in the texts, students' reading strategies and how these factors might affect reading comprehension. For example, it was found that it was relatively easy for students to comprehend intermodal meanings that were concurrent (equivalent) between images and written text, but students found it more difficult to comprehend intermodal meanings where there was complementarity between text and image, that is, where meaning in one part of the text added to meaning in the other. As expected, it was also found that greater complexity in the written language was related to increased difficulty of the questions (Daly & Unsworth, 2011).
Reading is a complex skill and comprehension of texts is affected by many factors such as the background knowledge about words and concepts that students bring to a text. However, during interviews some students indicated that they understood the individual words in a paragraph with complex sentences, but they did not understand 'the way they [the authors] said it'. It also became apparent that many students with poor reading comprehension spoke in short phrases or simple sentences even if they had a lot to say during the interviews. These two observations suggested a problem with expression and reception of grammatically complex language, and made it worth investigating whether there is a relationship between spoken language and reading comprehension.
Previous research
Bernstein (1974) contends that restricted and elaborated language codes are used by different groups of people according to their social position. This suggests that elaboration or complexity in spoken language could be a factor in regard to achievement in life. However, there has been limited research into the correlation between reading comprehension and oral language development, other than research in EFL (Reza, 2010) and learning disabilities (Burkhour, 1999). Like other research into oral language (Hart & Risley, 1995; Biemiller, 2003), these studies have concerned the number of words in spoken vocabulary rather than their complexity or the complexity of the grammar (syntax) as in elaborated language codes.
Research carried out with 22 Spanish bilingual learners in kindergarten in New York correlated syntax in both L1 and L2 with scores on the English Pre-reading level of the Gates MacGinitie Reading Test (Martohardjono et al., 2005). The researchers found significant correlations that were strong for both L1 and L2 and 'pointed to a strong relationship between syntactic skills and listening comprehension in young English language learners' (Martohardjono et al., 2005, p. 1538). However, this research was based on measures of oral comprehension of syntax (reception of spoken language) rather than measures of complexity in their spoken language (production of language).
In regard to the production of language, the vocabulary of a 3-year-old in a professional household was found by Hart and Risley (1995) to be larger than the vocabulary of an adult in a welfare household; and by age four, children in professional households had heard 45 million words compared with 13 million words in welfare households. Hart and Risley's (1995) findings about poor families were based on 'only six cases, all African-American, all in Kansas City, all on welfare' (Bomer, Dworin, May & Semingson, 2009, p. 2), which is a genuine criticism. However, the contention of Bomer et al. (2009), that parents do not use their entire vocabularies when talking to their children, seems to miss the point that parent-child interchange is what counts in language development (Williams, 1994; Heath, 1982) and a collaborative, conversational approach (like the parent-child interchange) is the best form of instruction for comprehension (Scull, 2010).
Ability in auditory processing, which precedes reading comprehension, was considered by Beron and Farkas (2004, p. 112) to be influenced by the oral language that the child takes from family and community. Other research studies also suggest that there is a link between linguistic competence at school and prior to school. The links have been found in relation to class (Goodman, 1990) and differences between teacher language and home language (Evans, 1994). More specifically, differences in vocabulary on entry to kindergarten have been related to cumulative vocabulary deficits in less advanced children as they progress through school (Biemiller, 2003).
Hill and Launder (2010, p. 243) developed a 'tailored oral language intervention program designed to develop oral language structures and vocabulary in rich play contexts.' However, no sequential relation was detected when only vocabulary and reading were assessed. The researchers noted that written text syntax embeds more adjectival and adverbial clauses than spoken language and they suggested 'future research may benefit from analysis of the relationship between children's use of complex oral language structure and early reading' (Hill & Launder, 2010, p. 251).
Syntax was also found to be relevant in a study by Hay, Elias, Homel, Freiberg, Ernst and Jensen (2003) who found that 44% of the children, in a disadvantaged multicultural community with a substantial Indigenous population, had language development delays and 38% had a delay in receptive vocabulary. The researchers suggested that 'teachers should concentrate on strategies that enhance children's vocabulary development and facilitate their use of more advanced and complex syntax' (Hay et al., 2003, p. 44). Recent research concerning children in pre-school and the first years of school supports the notion of oral language competencies underpinning children's transition into literacy (Hay & Fielding-Barnsley, 2009, p. 158; Munro, 2011).
However, it is not only the transition into literacy that might be affected by oral language competencies. As children progress in school the cognitive and linguistic demands on the reader are increased in more complex texts which require knowledge of less familiar words and language patterns (Chall & Curtis, 1991, p. 351). Accordingly, this research has focused on primary age students.
The increase in complexity and density of language in written language when compared to spoken language has been represented on a continuum by Martin (1985). Grammatically complex language has dependent clauses and would be mid-way along this continuum which extends from grammatically intricate everyday conversation with independent clauses to dense formal academic written language. The increase in complexity within spoken language has been further detailed in a continuum adapted by Jones as shown in Figure 1.
In view of the observations during interviews and the perceived gaps in the research, it was decided to investigate whether a relationship exists between primary students' spoken language complexity and their reading comprehension. Accordingly, one of the questions posed by the doctoral research was,
When students talk about images and verbal text ... are there significant relationships between reading score, the number of inferences made and the amount of linguistic complexity in students' talk about images and verbal text? (Daly, 2011, p. 12)
When significant quantified relationships were found between the reading comprehension scores and the number of instances of complexity in students' talk about texts, an additional exploratory analysis was carried out among groups of students in the older cohort (all students, Indigenous students and non-Indigenous students) among the following variables:
* the BST reading scores (the year prior to the interview)
* the post-test reading scores (the year after the interview)
* percentage of dependent clauses (spoken grammar)
* instances of non-core words (spoken vocabulary)
* instances of non-standard (Aboriginal English) grammar used
(Daly, 2011, p. 305)
The purpose of including the first two variables in the additional exploratory analysis was to investigate whether a significant relationship would exist for reading results in the year after the interviews as well as the year prior, to ensure the lapse in time was not affecting results. The next two variables would clarify whether complexity in vocabulary alone and/or grammatical structure alone in spoken language was related to reading comprehension, identifying whether both variables were related to reading or only one. The final variable would establish whether there could be a relationship between reading comprehension and the use of any non-standard English dialect such as working class English or Aboriginal English, in view of the relationship with writing and language test scores previously found by the author (Daly, 2006). However, dialectal features such as surface grammar are social markers that have been referred to as 'linguistic triviality' (Gee, 1990, p. 149) and they do not often preclude understanding between the dialects, so a significant correlation was not expected.
In view of the sample limitations of previous research (Hart & Risley, 2003) the current research aimed to establish whether there would be a correlation across a range of older students including Indigenous and non-Indigenous students in metropolitan, provincial and remote schools. A positive correlation would also build on the research by Hay et al. (2003), which focused on young children in a single multicultural disadvantaged community.
The same question was asked separately about Indigenous and non-Indigenous students in view of the differences in their reading performance on large scale group tests. For example, in sample international tests for 15-year-old students, 'in reading literacy, mathematical literacy and scientific literacy in the Program for International Student Assessment (PISA) 2000, 2003 and 2006 respectively, Indigenous students in Australia performed more than 80 score points (or more than one proficiency level) lower than non-Indigenous students and more than 50 score points lower than the OECD average' (De Bortoli & Thomson, 2009).
When students are reflecting on the meaning of texts and making inferences in interviews, they might be expected to make statements that are explanatory, causal or conditional. Such statements tend to be grammatically complex having dependent clauses beginning with words such as because ... , if ... , when ... and would be at the more written-like end of the spoken language continuum.
Context for research
The research was conducted in Australia where, until 2008, each state ran its own annual tests of literacy and numeracy. In the state of New South Wales (NSW) the tests were called the Basic Skills Test (BST) for students in Year 3 and Year 5 and the English Language and Literacy Assessment (ELLA) in Year 7. The NSW reading tests consisted predominantly of multiple-choice questions about short stimulus texts in a colour magazine, a format similar to that of the current National Assessment Program – Literacy and Numeracy (NAPLAN), which replaced the BST and ELLA in 2008.
The texts that were used and analysed in the main ARC project and in this research were taken from the Year 3 and Year 5 NSW Basic Skills Tests (BST) reading materials and questions. Rasch analysis (Bond & Fox, 2001) of the BST data, which was routinely carried out as part of the NSW Department of Education and Training (DET) assessment procedures for state-wide tests, was made available to the author. The Rasch analysis placed the results on a common reporting scale across tests for students in Year 3 (aged 7 to 8) and Year 5 (aged 9 to 10 years). The common reporting scale was achieved through equating processes whereby sample populations sat both current and past tests and some questions were common to both cohorts. A common reporting scale makes it possible to identify both the difficulty of questions and the growth in student achievement from Year 3 to Year 5.
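The common scale described above rests on the Rasch model, which expresses the probability of a correct response as a function of the gap between a student's ability and an item's difficulty, both measured on the same scale; this is what makes cross-year equating meaningful. A minimal sketch of the basic dichotomous model, for illustration only (the DET analyses used full equating designs and dedicated software):

```python
import math

def rasch_probability(ability: float, difficulty: float) -> float:
    """Basic (dichotomous) Rasch model: probability that a student with the
    given ability answers an item of the given difficulty correctly.
    Ability and difficulty sit on the same logit scale, which is why
    equated tests can report Year 3 and Year 5 results together."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

# A student whose ability equals the item's difficulty has a 50% chance
# of success; a harder item pushes the probability below 0.5.
print(rasch_probability(0.0, 0.0))        # 0.5
print(rasch_probability(1.0, 0.0) > 0.5)  # True
```

Because only the difference between ability and difficulty matters, item difficulties estimated from different test forms can be placed on one reporting scale once some common items anchor the forms together.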
During the main ARC research, statistical analysis of mean Rasch item difficulty (Daly & Unsworth, 2011, p. 69) showed that, amongst the items about the most complex image-language relations, the most difficult questions required students to comprehend more complex language than the easier questions. For example, the most difficult Year 5 BST question assessing complementarity of meaning between images and words could not be answered unless a student had comprehended a complex sentence in written commentary about a painting. The structural complexity in this sentence, which follows, involved three dependent clauses, ellipsis of words and three instances of the passive voice.

The sailfish is believed to be a cunning fish, able to feed amongst the various fish traps and nets shown by the dark areas, without being caught.
In contrast, the easiest Year 5 BST question assessing complementarity of meaning required students to make an inference about an image (Daly & Unsworth, 2011, pp. 69-71) by connecting it to the following dialogue, which only contained independent clauses:
Mum had bought some chocolate and Harry and I were starving.
'Can we have it now?' I asked.
'No, not yet,' said Dad.
'Why not?' whined Harry.
'Because,' said Dad.
'Because what?' I asked.
'Because I say so,' said Dad.
(from Zoo by Anthony Browne)
The context for the research establishes differential difficulty between reading test items assessing comprehension of text involving complex sentences and text involving simple or compound sentences. The analysis of the students' spoken language complexity was based on transcripts of interviews with the students about their comprehension of the BST texts using a 'think-aloud' procedure. This procedure and the methods for analysing the spoken language are outlined in the methodology.
Methodology
The methodology provides details of the students in the research sample and then describes the 'think-aloud' procedure used in the interviews. The methods used for measuring and analysing grammatical and lexical complexity in the students' spoken language are described next and, finally, the method of data analysis is outlined.
Research sample
During the research, 110 students in metropolitan, provincial and remote schools were interviewed about their comprehension of multimodal texts taken from the Basic Skills Test (BST) for Year 3 and Year 5. Students were interviewed when they were in Year 4 and Year 6, the year following the BST. There were similar numbers of boys and girls and similar numbers of Indigenous and non-Indigenous students among the 58 Year 4 students, then around 9 years old, and 52 Year 6 students, then around 11 years old.
The geographic location categories of metropolitan, provincial and remote are based on the MCEETYA (Ministerial Council on Education, Employment, Training and Youth Affairs) School Geographic Location Classification which was adopted in July 2001 for reporting nationally comparable schooling outcomes. The categories consider population and accessibility/remoteness in relation to services (Lyons, Cooksey, Panizzon, Parnell & Pegg, 2006).
For the ability groupings, reading scores were determined to be high if BST results were in the top reported band for reading, which is approximately the top quartile of the State. The top band for Year 3 was band 5 and the top band for Year 5 was band 6, because using the same reporting scale for both years entailed an extra band for the older cohort. The reading scores were considered medium if results were in the next two reported bands, which approximated the middle 50% of the State, and low if they were in the lower bands for reading, which approximated the bottom quartile of reading results for each cohort. The lowest category mostly represents students who are at or below the national minimum Benchmark standard.
All students in the classes selected were invited to participate in the study and the final range of ability of the students who accepted the invitation, as shown in Tables 1 and 2, was more representative of the local populations being studied than the proportions of the state achieving the band levels.
'Think-aloud' procedure
During the interviews students were asked to 'think aloud' about the meaning of words and images while reading the multimodal texts. Students then answered the multiple choice questions from the BST that involved comprehension of both image and language (the 'post-test') and identified how they decided on their answers. Following the protocol of Laing and Kamhi (2002), all of the 'think aloud' parts of the interview recordings were transcribed and divided into clauses. The clauses were then categorised as literal (repetition or paraphrase) or inferential (prediction, explanation or association) and were analysed for complexity of spoken language.
Analysis of spoken language--grammatical and lexical complexity
The students' spoken language was analysed using similar measures of structural (grammatical) and lexical (vocabulary) complexity to those used to analyse the BST texts in the main ARC research.
For the purposes of this research structural complexity has been measured by counting instances of dependent clauses and identifying the percentage of dependent clauses used out of all clauses spoken. Although embedded defining relative clauses are not part of a clause complex (Halliday, 2004/1994), they have been counted as instances of dependent clauses because they cannot stand alone as independent clauses. Relative clauses lead to increased difficulty in comprehension by young students because they extend the length of the noun group.
Lexical complexity in written texts is usually associated with features of dense text such as nominalisations, but, as expected in a children's test, the BST had few examples of nominalisation. Nevertheless, words can be simple or more advanced, so it was decided to assess relative lexical complexity in language by the number of instances of non-core words (Carter, 1987, p. 33), as opposed to core words. Core words are generally seen to be the most basic or simple word choice. The test for core and non-core words used substitution, for example,

in the lexical set, gobble, dine, devour, eat, stuff, gormandise each of the words could be defined using 'eat' as a basic semantic feature but it would be inaccurate to define eat by reference to any other of the words in the set (i.e. dine entails eat but eat does not entail dine) ... (Carter, 1987, p. 35).
The few nominalisations used by some students were naturally included in the non-core vocabulary measure.
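Once a transcript has been hand-coded, the two spoken-language measures described above reduce to simple tallies. The following sketch shows the arithmetic only; the coded clauses and the non-core word list are hypothetical stand-ins for the study's actual manual coding:

```python
def complexity_measures(coded_clauses, non_core_words):
    """Tally the two complexity measures used in the study:
    - percentage of dependent clauses out of all clauses spoken
    - number of instances of non-core words.
    `coded_clauses` is a list of (clause_text, is_dependent) pairs as a
    human coder might produce; `non_core_words` is the coder's word list.
    Both inputs are illustrative, not the study's actual coding scheme."""
    if not coded_clauses:
        return 0.0, 0
    dependent = sum(1 for _, is_dependent in coded_clauses if is_dependent)
    pct_dependent = 100.0 * dependent / len(coded_clauses)
    non_core_count = sum(
        1
        for text, _ in coded_clauses
        for word in text.lower().split()
        if word in non_core_words
    )
    return pct_dependent, non_core_count

# Hypothetical example: 2 dependent clauses out of 3, one non-core word.
clauses = [
    ("they are colours", False),
    ("that have been used through the generations", True),
    ("because it takes longer", True),
]
pct, non_core = complexity_measures(clauses, {"generations"})
print(round(pct, 1))  # 66.7
print(non_core)       # 1
```

On this arithmetic, the Year 6 student discussed in the results (38 dependent clauses out of 114) scores 33%.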
Data analysis
Pearson correlations were conducted for the two cohorts interviewed (Year 4 and Year 6 students) among the following variables:
* original BST reading score (from the year before the interview across all questions)
* the post-test score (during the interview on questions involving multimodality)
* number of correct inferences and explanatory inferences (about words and images)
* combined spoken language complexity (number of dependent clauses plus number of non-core words)
If the previous correlations were significant, then, for all students in one cohort, for the Indigenous students in that cohort and for the non-Indigenous students in that cohort, further Pearson correlations would be conducted among the following variables:
* percentage of correct answers in Year 5 BST Reading
* percentage of correct answers in Year 7 ELLA Reading
* percentage of dependent clauses used in Year 6 interviews
* number of non-core words used in Year 6 interviews
* number of instances of non-standard English used in Year 6 interviews
The cohort to be further analysed would be the one that produced more complexity in spoken language, and this was expected to be the older cohort because language complexity develops as students mature. The fact that the reading texts in the Year 5 tests were more complex was also considered likely to elicit more complexity in the Year 6 students' spoken language.
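The Pearson correlations listed above measure the strength of linear association between each pair of variables. An illustrative reimplementation with invented scores (the study used standard statistical software and its actual data):

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired lists of
    scores, as conducted among the variables listed above."""
    mx, my = mean(xs), mean(ys)
    dx = [x - mx for x in xs]
    dy = [y - my for y in ys]
    numerator = sum(a * b for a, b in zip(dx, dy))
    denominator = (sum(a * a for a in dx) * sum(b * b for b in dy)) ** 0.5
    return numerator / denominator

# Invented paired data: reading scores and dependent-clause percentages.
reading_scores = [35, 42, 50, 58, 63]
dependent_clause_pct = [5, 9, 14, 20, 25]
r = pearson_r(reading_scores, dependent_clause_pct)
print(r)  # strong positive correlation for these invented scores
```

A value near +1 indicates that students with higher reading scores also tend to use a higher percentage of dependent clauses; significance at p < .01 additionally depends on sample size.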
Results
The findings from statistical analyses of the data from both the Year 4 and Year 6 samples are presented first. As the Year 6 students produced more examples of complex language, the next findings reported are the extra analyses of the percentage of dependent clauses and number of non-core words used by Indigenous Year 6 students and by the non-Indigenous Year 6 students. Finally, a few examples of complexity in the students' spoken language are presented.
Quantitative data from statistical analyses
Pearson correlation statistics were significant (p < .01) for both Year 4 and Year 6, as shown in Tables 3 and 4, across the following variables:
* the combined grammatical and lexical complexity of spoken language
* the number of correct answers to all reading questions in the Basic Skills Test
* the number of correct answers to questions in the post-test during interviews
* correct inferences about verbal text
* explanatory inferences about verbal text
* correct inferences about images
* explanatory inferences about images
Pearson correlations were also significant (p < .01) for all Year 6 students (see Table 5), Year 6 Indigenous students (see Table 6) and Year 6 non-Indigenous students (see Table 7), when they were conducted across the following variables,
* percentage of dependent clauses spoken
* the number of non-core words spoken
* reading scores in the Year 5 BST
* reading scores in the Year 7 ELLA
However, no correlation was evident between any of these variables and the use of non-standard English for any of the three groupings of students.
Qualitative data--examples of spoken language
It was interesting that a Year 6 metropolitan male Indigenous student had the highest percentage of dependent clauses (38 out of a total of 114, being 33%) and a reading score in the top band for the Year 5 BST. In view of the constant focus on the gap between the mean performance of Indigenous and non-Indigenous students in Australia, this finding was a reminder that Indigenous students have equal ability to comprehend complex text and use complex spoken language. The following excerpts from his interview provide examples of explanatory inferences using complex sentences with adverbial, relative, embedded and non-finite clauses.
Student: ... a float that moves up and down [relative embedded] to move the larger, longer one. [non-finite]
Student: Because if you go around the place [adverbial] it takes longer, where if you just go straight [adverbial] it takes much less time.
Student: They are colours that have been used through the generations. [relative]
This student also had more complex vocabulary than any other student, using nineteen non-core words, which included co-ordinates, capsized, indication, generations, hieroglyphs, evaporated, ochre and distinguish.
All of the eight students who used no dependent clauses had low BST reading scores. These students included non-Indigenous and metropolitan students as well as Indigenous students and students in provincial and remote schools. Many other students with low reading scores used only one or two dependent clauses. Even though they used fewer complex grammatical structures, some of these students made a large number of statements so a lack of understanding of the verbal text had not limited their vocal output.
It was also observed that, during use of the 'think-aloud' procedure, some of the students were able to correctly answer questions for which they had selected incorrect answers during the BST reading the previous year.
Discussion
The significant correlations for Indigenous and for non-Indigenous students indicate that for both of these groups in the research sample there was a relationship between their reading comprehension and the degree of complexity in their spoken language. In view of this finding, and while keeping in mind the small sample size in this research, it is important to ask why the results are different to the state-wide gap in performance between Indigenous and non-Indigenous students. It is also important to consider whether the answer suggests a reason for the state-wide gap in performance.
It is relevant to note that by choosing to study similar numbers of Indigenous and non-Indigenous students, this study was not a representative sample of the state, where there are less than 3% Indigenous students in metropolitan schools, approximately 10% in provincial schools and 30% in remote schools. After the research was completed, another factor became apparent when the Index of Community Socio-educational Advantage (ICSEA) values became available. The ICSEA value for each school is based on Australian Bureau of Statistics data related to school characteristics such as location and characteristics of the students such as parental education and occupation. These factors impact on student and school performance in NAPLAN and it became apparent that most of the schools in the research sample had ICSEA values that were close to average, whereas the majority of Indigenous students in the state attend schools with below average ICSEA values.
Research by Torr (2008) concerning Indigenous literacy has found that mothers' education level is more relevant to their beliefs about literacy development than whether they are Indigenous or not. Torr showed that 'tertiary educated mothers, whether Indigenous or Anglo-Australian, held views that were mostly compatible with an emergent literacy perspective' and 'early school-leaving mothers focused more on the role of memory and repetition of specific skills in learning to read' (Torr, 2008, p. 65).
Similar findings have been made about the relevance of parents' level of education or professional status to student performance in the USA (Heath, 1982) and Australia (Williams, 1994; Najman, Aird, Bor, O'Callaghan, Williams & Shuttlewood, 2004). This could partly explain why it has been found in the National Assessment Program – Literacy and Numeracy (NAPLAN) that 'mean scores are higher for students whose parents have higher levels of education' (ACARA, 2013, pp. 64, 128). The wider population differences in literacy achievement between Indigenous and non-Indigenous students could be due to generational differences in education levels evidenced by census data revealing that Indigenous children are more than twice as likely to have parents who left school early (Scougall, 2008). There could be a similar explanation for the mean differences in reading comprehension achievements between students in metropolitan, provincial and remote schools in view of the lower accessibility of tertiary education and the higher number of students leaving school early in country areas (Calma, 2005, p. 30).
The effect of parent education levels on student performance might also be related to differences between the vernacular language used by highly educated parents (elaborated codes) and the restricted codes used by less educated parents (Bernstein, 1974). Elaborated codes are closer to the written end of the language continuum than restricted codes and are more likely to include explanatory language with its concomitant use of dependent clauses. When considering that the higher reading scores relied on inferential comprehension of more complex written text, the correlations between reading comprehension scores and spoken language complexity suggest that developing complexity in spoken language could play a role in developing inferential comprehension of complex written text. Inferential comprehension requires reasoning and Mercer, Wegerif and Dawes (1999) 'confirm the value of explicitly teaching children how to use language to reason' (p. 95) through exploratory talk.
In the current climate where national tests in reading and writing have led to an earlier focus on these skills, sometimes at the expense of speaking and listening, it is important to remember that oral language is the foundation on which literacy is built (Hay & Fielding-Barnsley, 2009, p. 158). The findings from the current research are a timely reminder for teachers of the need to recognise that students from a variety of backgrounds could have a range of complexity in their spoken language. The findings also highlight the need for teachers to consider a focus on strategies to develop grammatical complexity as well as vocabulary in students' spoken language.
Conclusions and recommendations
The findings from this study cannot be extrapolated to the whole population in view of the small sample size, but they are nonetheless highly relevant to the research and teaching community. As much of the previous research relating spoken language to reading comprehension concerned vocabulary and/or very young students, this study's finding of a correlation between grammatical complexity in spoken language and reading comprehension in older students fills a notable gap. The importance of these findings is highlighted by recent research establishing that a focus on developing the spoken language of students with low literacy skills can assist the students to improve their reading comprehension and general ability to learn (Munro, 2011).
John Munro achieved improved student outcomes at Bellfield Primary School in Melbourne when he put a 'focus on the students' vocabulary and the ability to talk about ideas in sentences, and to ask and answer questions as a means for building new knowledge' (Munro, 2011, p. vii). Strategies to develop the use of complex sentences are included within the grammar conventions aspect of his ICPALER model of communication. ICPALER represents Ideas communicated, the Conventions used to do this, the Purpose for communicating and the Ability to Learn, when Expressing and Receiving ideas. Accelerated literacy (Gray, 2007) is another program that develops students' spoken language while teaching them to read by having them talk out the meaning in texts.
Further research in the classroom is recommended to identify which methods are most successful in teaching students to express and understand more complex sentence meanings, and whether those methods also lead to improvements in reading comprehension. Such research could include Munro's Oral Language Screening Profiles (Munro, 2011, pp. 230-241) to assess expression and reception of sentence conventions and meanings. The profiles are also an excellent resource for teachers to ascertain the oral language development of their students.
It would be beneficial to investigate the relationship between cognitive development, structural complexity in spoken language and comprehension of complex text in view of the significant correlations between reading, complex spoken language and the number of inferences produced. Although it would be difficult to prove any causal link, a mutual developmental relation among these skills is highly likely, in view of the reliance on dependent clauses to successfully link ideas when expressing explanatory and causal inferences.
The use of a 'think aloud' procedure has been found to improve inferential comprehension (Laing & Kamhi, 2002) and it seemed to assist some students during the post-test, but the one-on-one situation is not easily transferred to the classroom. However, aspects of the procedure might be successfully implemented in classrooms in other ways. For example, relevant talk about texts can be scaffolded with some additional personnel as in the Parents as Partners program studied by Hill and Diamond (2013) in a school where, 'The teachers and parents engaged in the play activities to introduce new vocabulary, read books and used specific prompts to extend children's oral language' (Hill & Diamond, 2013, p. 52).
The findings have shown that, regardless of whether students were Indigenous or not, those who were experiencing difficulty with reading comprehension neither used verbal complexity when speaking nor understood it when reading. It is possible that these students experienced more restricted linguistic input prior to school, but further research into the spoken language of 'at risk' students and their families is needed before firmer conclusions can be drawn. This research suggests that teachers need to focus on speaking and listening skills, not just with young students but also with older students experiencing reading comprehension difficulties. By assessing students' oral language and focusing on strategies to develop spoken language where needed, teachers may be able to help educationally disadvantaged students improve their reading comprehension.
Ann Daly
University of New England
Acknowledgements
Permission from Professor Len Unsworth to conduct this research into spoken language within a research project concerning comprehension of multimodal text is greatly appreciated and gratefully acknowledged.
References
Australian Curriculum, Assessment and Reporting Authority (ACARA) (2013). National Assessment Program Literacy and Numeracy National Report for 2013. Retrieved from http://www.nap.edu.au/verve/_resources/NAPLAN_2013_National_Report.PDF
Bernstein, B. (1974). Class, codes and control: Volume 1. Theoretical studies towards a sociology of language, second revised edition. London, UK: Routledge and Kegan Paul.
Beron, K. & Farkas, G. (2004). Oral language and reading success: A structural equation modelling approach. Structural Equation Modeling, 11 (1), 110-131.
Biemiller, A. (2003). Oral comprehension sets the ceiling on reading comprehension. American Educator: Spring 2003.
Bomer, R., Dworin, J., May, L. & Semingson, P. (2009). What's wrong with a deficit perspective? Teachers College Record, June 2009 http://www.tcrecord.org ID No: 15648.
Bond, T.G. & Fox, C.M. (2001). Applying the Rasch model: Fundamental measurement in the human sciences. Mahwah, NJ: Lawrence Erlbaum.
Burkhour, H. (1999). The relationship between vocabulary and reading comprehension in junior high aged students with learning disabilities. Masters Theses, Paper 534. Graduate Research and Creative Practice, Grand Valley State University.
Calma, T. (2005). Face the facts. Sydney: Human Rights and Equal Opportunity Commission.
Carter, R. (1987). Vocabulary: Applied linguistic perspectives. London, UK: Allen & Unwin.
Chall, J. & Curtis, M. (1991). Responding to individual differences among language learners: Children at risk. In J. Jensen, D. Lapp & J. Squire (Eds.), Handbook of research on teaching the English language arts, (pp. 349-355). New York: MacMillan.
Daly, A. (2011). Aboriginal and rural students' comprehension and talk about image-language relations in reading tests. PhD thesis, University of New England, Armidale.
Daly, A. (2006). Assessing the literacy needs of students who speak Aboriginal English. Paper presented at Sydney University, National Conference on Future Directions in Literacy, Sydney, Australia. Retrieved March 2011 (but no longer available) from www.edsw.usyd.edu.au/schools_teachers/prof_dev/resources/Lit_proceedings.pdf
Daly, A. & Unsworth, L. (2011). Analysis and comprehension of multimodal texts. Australian Journal of Language and Literacy, 34 (1), 61-80.
De Bortoli, L. & Thomson, S. (2009). The Achievement of Australia's Indigenous students in PISA 2000-2006. Camberwell, Victoria: Australian Council for Educational Research.
Evans, J. (1994). Oral language developmental continuum. Education Department of Western Australia. Melbourne: Longman.
Gee, J. (1990). Social linguistics and literacies: ideology in discourses. Brighton, UK: Falmer Press.
Goodman, Y. (1990). The development of initial literacy. In R. Carter (Ed.), Knowledge about language and the curriculum: the LINC reader. London: Hodder and Stoughton Ltd.
Gray, B. (2007). Accelerating the literacy development of Indigenous students. Charles Darwin University, Darwin, NT: Uniprint NT.
Halliday, M.A.K. (2004/1994). An introduction to Functional Grammar. Third Edition, revised by Christian Matthiessen. London, UK: Arnold.
Hart, B. & Risley, T. (1995). Meaningful differences in the everyday experience of young American children. Baltimore: Paul H. Brookes.
Hay, I., Elias, G., Homel, R., Freiberg, K., Ernst, R. & Jensen, H. (2003). Occurrence of language difficulties in children and the effectiveness of a language intervention program. In B. Bartlett, F. Bryer & D. Roebuck (Eds.), Reimagining Practice Researching Change (Vol. 2), 41-54. Brisbane: Griffith University.
Hay, I. & Fielding-Barnsley, R. (2009). Competencies that underpin children's transition into early literacy. Australian Journal of Language and Literacy, 32 (2), 148-162.
Heath, S. (1982). What no bedtime story means: narratives at home and school. Language in Society, 11, 49-78.
Hill, S. & Diamond, A. (2013). Family literacy in response to local contexts. Australian Journal of Language and Literacy, 36 (1), 48-55.
Hill, S. & Launder, N. (2010). Oral language and beginning to read. Australian Journal of Language and Literacy, 33 (3), 240-254.
Jones, P. (1996). Planning an oral language program. In P. Jones (Ed.), Talking to learn. Newtown, Australia: Primary English Teaching Association.
Laing, S. & Kamhi, A. (2002). The use of think-aloud protocols to compare inferencing abilities in average and below-average readers. Journal of Learning Disabilities, 35 (5), 436-447.
Lyons, T., Cooksey, R., Panizzon, D., Parnell, A. & Pegg, J. (Eds.) (2006). SiMERR (National Centre of Science, Information and Communication Technology and Mathematics Education for Rural and Regional Australia) National Survey. Armidale, NSW: SiMERR National Centre, University of New England. Accessed at http://simerr.une.edu.au/pages/projects/1nationalsurvey/Abridged%20report/Abridged_Full.pdf
Martin, J.R. (1985). Language, register and genre. In F. Christie (Ed.), Children writing course reader. Geelong, Victoria: Deakin University Press.
Martohardjono, G., Otheguy, R., Gabriele, A., de Goeas-Malone, M., Szupica-Pyrzanowski, M., Troseth, E., Rivero, S. & Schutzman, Z. (2005). The role of syntax in reading comprehension: A study of bilingual learners. Proceedings of the 4th International Symposium on Bilingualism. Somerville, MA: Cascadilla Press.
Mercer, N., Wegerif, R. & Dawes, L. (1999). Children's talk and the development of reasoning in the classroom. British Educational Research Journal, 25 (1), 95-111.
Munro, J. (2011). Teaching Oral Language: Building a firm foundation using ICPALER in the early primary years. Camberwell, Victoria: ACER Press.
Najman, J., Aird, R., Bor, W., O'Callaghan, M., Williams, G. & Shuttlewood, G. (2004). The generational transmission of socioeconomic inequalities in child cognitive development and emotional health. Social Science and Medicine, 58, 1147-1158.
Kafipour, R. (2010). Vocabulary learning strategies, vocabulary knowledge and reading comprehension of EFL undergraduate students in Iran. PhD thesis, Universiti Putra Malaysia.
Scougall, J. (2008). Lessons learnt about strengthening Indigenous families and communities. Occasional Paper No. 19, FACS. Australian Government Department of Families, Housing, Community Services and Indigenous Affairs.
Scull, J. (2010). Embedding comprehension within reading acquisition processes. Australian Journal of Language and Literacy, 33 (2), 87-107.
Torr, J. (2008). Mothers' beliefs about literacy development: Indigenous and Anglo-Australian mothers from different educational backgrounds. The Alberta Journal of Educational Research, 54 (1), 65-82.
Unsworth, L. & Chan, E. (2008). Assessing integrative reading of images and text in group reading comprehension tests. Curriculum Perspectives, 28 (3), 71-76.
Williams, G. (1994). Joint book-reading and literacy pedagogy: A socio-semantic examination. PhD thesis, School of English and Linguistics, Macquarie University.
Ann Daly taught in primary schools for ten years and worked in literacy assessment for ten years. She is currently conducting evaluations of education programs. Ann's PhD research concerned the spoken language and comprehension of multimodal texts by Aboriginal and non-Aboriginal students in metropolitan and rural schools in NSW.

Table 1. Number of students interviewed in Year 4

Year 4 Indigenous (28): Year 3 BST results
                 Low*   Medium*   High*
Metropolitan      5       3        1
Provincial        5       2        1
Remote            3       7        1
Total            13      12        3

Year 4 Non-Indigenous (30): Year 3 BST results
                 Low    Medium   High    Location total
Metropolitan      4       4        3          20
Provincial        3       4        3          18
Remote            4       3        2          20
Total            11      11        8          58 (26 boys, 32 girls)

* High = results in the top reporting band (Band 5 for Year 3), approximately the top quartile; Medium = results in the next two reporting bands (Bands 3 and 4 for Year 3); Low = results in the lower bands (Bands 1 and 2 for Year 3), approximately the bottom quartile. Location totals combine Indigenous and non-Indigenous students.

Table 2. Number of students interviewed in Year 6

Year 6 Indigenous (24): Year 5 BST results
                 Low*   Medium*   High*
Metropolitan      2       3        2
Provincial        5       5        0
Remote            2       5        0
Total             9      13        2

Year 6 Non-Indigenous (28): Year 5 BST results
                 Low    Medium   High    Location total
Metropolitan      5       4        4          20
Provincial        1       3        4          18
Remote            3       3        1          14
Total             9      10        9          52 (27 boys, 25 girls)

* High = results in the top reporting band (Band 6 for Year 5), approximately the top quartile; Medium = results in the next two reporting bands (Bands 4 and 5 for Year 5); Low = results in the lower bands (Bands 1, 2 and 3 for Year 5), approximately the bottom quartile. Location totals combine Indigenous and non-Indigenous students.

Table 3. Summary of correlation results (Year 4 linguistic complexity of talk at time of post-test / scores / inferences), N = 58

Linguistic complexity of talk correlated with:   Pearson's r   Sig. (2-tailed)
Year 3 BST reading score                           .498**          .000
Year 4 post-test score                             .496**          .000
Correct inference about verbal text                .486**          .000
Explanatory inference about verbal text            .368**          .005
Correct inference about images                     .788**          .000
Explanatory inference about images                 .754**          .000

** Correlation is significant at the 0.01 level (2-tailed).

Table 4. Summary of correlation results (Year 6 linguistic complexity of talk at time of post-test / scores / inferences), N = 52

Linguistic complexity of talk correlated with:   Pearson's r   Sig. (2-tailed)
Year 5 BST reading score                           .478**          .000
Year 6 post-test score                             .416**          .002
Correct inference about verbal text                .732**          .000
Explanatory inference about verbal text            .599**          .000
Correct inference about images                     .581**          .000
Explanatory inference about images                 .577**          .000

** Correlation is significant at the 0.01 level (2-tailed).

Table 5. Pearson correlations for all Year 6 students among % correct BST and ELLA scores and the number of non-core words, instances of non-standard grammar and % of dependent clauses used in interviews

                       % Correct   % Correct   % Dependent   Non-core   Non-standard
                       BST Yr 5    ELLA Yr 7   clauses       words      grammar
% Correct BST Yr 5       1          .881**       .689**       .581**     -.327*
% Correct ELLA Yr 7     .881**      1            .694**       .612**     -.295*
% Dependent clauses     .689**      .694**       1            .656**     -.175
Non-core words          .581**      .612**       .656**       1          -.192
Non-standard grammar   -.327*      -.295*       -.175        -.192        1

N = 52, except for correlations involving ELLA Year 7 (N = 47). ** Correlation is significant at the 0.01 level (2-tailed). * Correlation is significant at the 0.05 level (2-tailed).

Table 6. Pearson correlations for Year 6 Indigenous students among % correct BST and ELLA scores and the number of non-core words, instances of non-standard grammar and % of dependent clauses used in interviews

                       % Correct   % Correct   % Dependent   Non-core   Non-standard
                       BST Yr 5    ELLA Yr 7   clauses       words      grammar
% Correct BST Yr 5       1          .865**       .705**       .632**     -.327
% Correct ELLA Yr 7     .865**      1            .653**       .630**     -.314
% Dependent clauses     .705**      .653**       1            .704**     -.290
Non-core words          .632**      .630**       .704**       1          -.353
Non-standard grammar   -.327       -.314        -.290        -.353        1

N = 24, except for correlations involving ELLA Year 7 (N = 22). ** Correlation is significant at the 0.01 level (2-tailed).

Table 7. Pearson correlations for Year 6 non-Indigenous students among % correct BST and ELLA scores and the number of non-core words, instances of non-standard grammar and % of dependent clauses used in interviews

                       % Correct   % Correct   % Dependent   Non-core   Non-standard
                       BST Yr 5    ELLA Yr 7   clauses       words      grammar
% Correct BST Yr 5       1          .901**       .690**       .545**     -.305
% Correct ELLA Yr 7     .901**      1            .728**       .633**     -.237
% Dependent clauses     .690**      .728**       1            .610**     -.022
Non-core words          .545**      .633**       .610**       1          -.058
Non-standard grammar   -.305       -.237        -.022        -.058        1

N = 28, except for correlations involving ELLA Year 7 (N = 25). ** Correlation is significant at the 0.01 level (2-tailed).
Figure 1. Spoken language continuum (adapted from Martin, 1985): from most spoken-like, language accompanying action (informal face-to-face chat, small group problem-solving tasks, reporting back on a task, class discussions, show and tell), to most written-like, language as reflection (newstime, spoken information reports, reading aloud).