School socioeconomic status and student outcomes in reading and mathematics: a comparison of Australia and Canada.
Perry, Laura B. ; McConney, Andrew
Abstract
Previous research has established that student outcomes are
strongly associated with the socioeconomic composition of a school, also
known as school socioeconomic status. Less is known, however, about the
ways in which the relationship varies for different students, schools
and national education systems. Here, we conduct a secondary analysis of
an international dataset to examine the strength of the relationship
between school socioeconomic status and achievement in math and reading
for Canada and Australia. The history, economy and culture of these two
countries are similar, as are many aspects of their education systems.
One important difference, however, is the degree to which their
education systems are marketised. Our findings show that in both
countries, school socioeconomic status is strongly associated with
academic achievement for all students, regardless of their individual
socioeconomic status. Nevertheless, the relationship between school
socioeconomic status and academic achievement is substantially stronger
in Australia than in Canada. We conclude that student outcomes are more
equitable in Canada than in Australia, and suggest that this may be due
to differences in the ways in which the two education systems are funded
and structured.
Keywords
Socioeconomic status, peer influence, school demography, academic
achievement, comparative analysis
Introduction
The relationship between social background and educational outcomes
is well-established. Typically, the relationship is strong and positive,
wherein higher socioeconomic status (SES) is associated with better
educational outcomes. Students from privileged social backgrounds have
on average higher test scores (Organisation for Economic Co-operation
and Development [OECD], 2007a; Perie, Moran, & Lutkus, 2005), are
more likely to complete secondary education (Renzulli & Park, 2000)
and are more likely to attend university (Blossfeld & Shavit, 1993;
Connor & Dewson, 2001; Lee, 1999; Terenzini, Cabrera, & Bernal,
2001) compared to their less-privileged peers. These relationships
persist after controlling for students' prior ability. For example,
data from the US Department of Education shows that 78% of high
achieving/low SES students attended university in 1992 compared to 97%
of high achieving/high SES students (Lee, 1999). The relationship
between students' social backgrounds and their educational outcomes
exists in all societies, although the strength of the relationship
varies from very strong to moderate (OECD, 2004, 2007a). Nonetheless, on
average, social background is a strong predictor of students'
educational outcomes in all countries.
The social background of school peers is also
associated with students' educational outcomes. The overall
socioeconomic composition of a school (mean school SES) is positively
related to a range of educational outcomes beyond students' own
social backgrounds (Palardy, 2008; Perry & McConney, 2010a, 2010b;
Rumberger & Palardy, 2005; Southworth, 2010; Sui-Chu & Willms,
1996). On average, a student who attends a higher SES school enjoys
higher educational outcomes compared to a student from a similar social
background who attends a lower SES school. In many countries, academic
performance is even more strongly associated with school SES than with a
student's individual SES (OECD, 2004, 2007a; Sirin, 2005). While
the reasons behind the association are varied, complex and not fully
understood, it is likely that higher SES schools are better positioned
to provide productive and stimulating learning environments compared to
other schools (Willms, 2010).
While the research literature has shown conclusively that school
SES is positively associated with student outcomes, a number of
questions remain unanswered. For example, to what extent is the
relationship different for lower SES students than for their more
advantaged peers? Additionally, to what extent is the relationship
between school SES and student outcomes uniform, wherein increases in
school SES are consistently associated with increases in student
achievement? Does the relationship weaken (or strengthen) as school SES
increases? Do these relationships among school SES, student SES and
academic performance vary cross-nationally, and if so, why?
In a recent series of studies, Perry and McConney have examined
these finer-grained relationships for Australia (McConney & Perry,
2010; Perry & McConney, 2010a, 2010b). They have shown that in
Australia, the relationship between school SES and student outcomes is
strong for all students regardless of their own social backgrounds. They
have also shown that the relationship between school SES and student
achievement strengthens as the SES of the school increases. Put another
way, achievement differences between students in low and middle SES
schools tend to be smaller than achievement differences between students
in middle and high SES schools. In this study, we extend those analyses
to Canada, a country that is similar culturally and economically to
Australia and whose educational system is considered both equitable and
high performing (OECD, 2007a). For the current study, our first two
research questions, focused on Canada, are:
(1) To what extent is the association between school SES and
student achievement consistent for all students, regardless of their
individual SES?
(2) To what extent is the association between school SES and
achievement similar for students in low SES schools as compared to their
peers in high SES schools?
As this is a comparative study, we also compare our analyses for
Canada with previous findings for Australia. Our third research question
therefore is:
(3) How do the findings from research questions 1 and 2 differ
between Canada and Australia?
Our purpose in this paper, therefore, is to examine the extent to
which the finer-grained details of the relationship between school SES
and student achievement vary between Canada and Australia. A comparison
of the two educational systems is theoretically significant because they
differ in an important regard--the degree to which they are marketised
(more detail about this is provided later in the paper). Our method of
analysis does not allow us to explain variations in the relationship
across the two countries, nor does it allow us to examine how the
relationship may be mediated by marketisation. We are nonetheless
interested in offering some preliminary insight about how the
relationship between school SES and academic achievement varies between
educational systems with varying degrees of marketisation. Recent
cross-national research by Alegre and Ferrer (2010) has suggested that
educational marketisation increases socioeconomic segregation between
schools, which has a negative impact on educational equity.
Background
The groundbreaking Coleman Report was one of the first studies to
examine the influence of school peers on student achievement. The report
found that academic achievement was more strongly associated with the
ethnic and social composition of a school's student body than with
its resources or facilities (Coleman et al., 1966). The report also
found that attending a racially and socioeconomically desegregated
school raised the achievement of working class African--American
students without lowering the achievement of their white, middle-class
peers. Later studies have confirmed that attending a socially segregated
school that enrols primarily students from low socioeconomic backgrounds
negatively impacts students' educational outcomes (Dronkers &
Levels, 2006; McConney & Perry, 2010; Orfield & Yun, 1999; Perry
& McConney, 2010a, 2010b; Robertson & Symons, 2003; Willms,
1999).
While the research literature strongly suggests that attending a
low SES school is associated with lower educational outcomes for all
students, much less is known about how the relationship between school
SES and educational outcomes can vary for students depending on their
own (family) social background. Some studies suggest that the
association between academic achievement and school socioeconomic
composition is stronger for lower SES students than their higher SES
peers (Kahlenberg, 2001; McPherson & Willms, 1987; Robertson &
Symons, 2003; Zimmer & Toma, 2000), while others suggest that the
association is similar for all students (OECD, 2004, 2007a; Perry &
McConney, 2010a, 2010b; Rumberger & Palardy, 2005; Sui-Chu &
Willms, 1996). However, only Perry and McConney within the last group of
studies have explicitly compared the strength of the association for low
and high SES students, leaving open some possibility that the
relationship is in fact not equally strong for all students.
The series of studies by McConney and Perry does suggest that the
relationship is similarly strong for students across all SES groups in
Australia, but further research is needed to determine if this is true
for other countries. Indeed, little is known about how the relationship
between school SES and student outcomes varies cross-nationally. The
OECD reports (2004, 2005, 2007a, 2007b) have been the first to
systematically examine cross-national differences by comparing the
amount of variation in student performance on the Programme for
International Student Assessment (PISA) explained by the SES of the
student, as well as the SES of the school. The reports have shown that
school SES has a larger effect on student performance than does student
SES in almost all participating countries (OECD, 2007a). In Australia
and Canada, the effect size on student performance is twice as large for
school SES as it is for student SES. For these two countries, one-half
of a standard deviation in the economic, social and cultural status
(ESCS, PISA's composite measure of SES) index at the school level
is associated with a score difference of 23 (Canada) to 29 (Australia)
points, while a similar difference at the student level is associated
with a score difference of 15 points. The relationship between school
SES and student performance on PISA is particularly strong in many
continental European countries (e.g., Netherlands, Germany and Austria),
moderately strong in English-speaking countries (including Canada and
Australia) and weakest in the countries of northern Europe (e.g.,
Finland and Norway). For example, one-half of a standard deviation in
school SES is associated with a difference of only 5 score points in
Finland. In the Netherlands, by contrast, the comparable score point
difference for school SES is 62 points.
While the OECD's PISA reports have been able to show for the
first time how the relationship between school SES and student outcomes
varies cross-nationally, they have done so in broad brush strokes. The
reports show the strength of the relationship, on average, for all
students in a particular country. They do not, however, examine
cross-national differences in the relationship between school SES and
student outcomes for different groups of students (e.g., low or high SES
students). We do not know whether the relationship between school SES
and student performance is stronger for low SES students in only some
countries, for example. Similarly, little is known about how the strength
of the relationship between school SES and student performance varies
across school SES contexts in different countries. For example, Perry
and McConney (2010b) found that the relationship between school SES and
student performance in Australia is particularly strong in higher SES
school contexts. The average achievement difference between low and
middle SES schools in Australia is substantially smaller than the
achievement difference between middle and high SES schools. In other
countries, the relationship may be different; for example, the
relationship may be equally strong in lower SES as it is in higher SES
school contexts. Similarly, the relationship may be different for high
SES students than for their less privileged peers in some countries.
Understanding these finer-grained aspects of the relationship between
school SES and student outcomes, particularly in cross-national
analyses, may provide insights about sustainable and realistic ways to
improve educational equity through reducing the influence of where
students attend school and of their social backgrounds.
Australian and Canadian contexts
As previously noted, the relationship between school SES and student
performance has already been examined for Australia.
In this paper, we chose to use the same PISA 2006 dataset to compare our
findings for Australian students with those for students in Canada, a
country whose educational system is considered by researchers and
policymakers to be both high performing and highly equitable. Barry
McGaw, one of the architects of PISA and currently Chair of the
Board of the Australian Curriculum, Assessment and Reporting Authority,
has noted that Australia should look to Canada for insight about how to
improve the performance and equity of its educational system (McGaw,
2010). This paper responds to McGaw's call by comparing the role of
school SES, one of the strongest predictors of student achievement,
across the two countries.
Comparing findings from Australia and Canada is meaningful because
the two countries are similar in many ways. Both are very large
geographically but have relatively small populations. Additionally, both
have a history of British colonial rule, have similar resource-based
economies, and are immigrant countries with similar demographic
profiles. Both countries attract highly educated immigrants, and
differences in educational outcomes between immigrant and non-immigrant
students are often very small and sometimes even nonexistent (OECD,
2010). Both countries have similar levels of poverty and income
inequality (CIA, 2011). In comparing the relationship between
educational achievement and school SES in Australia and Canada, we are
comparing "apples with apples," not "apples with
oranges." Comparing "like" countries allows researchers
to control, albeit crudely, contextual factors that can obscure
conclusions about educational phenomena and relationships. For example,
Finland and Korea both do well on PISA but have very different
socio-cultural, historical and educational contexts in comparison to
Australia. What works in Finland, a small and ethnically homogenous country, may not be replicable in Australia with its immense geography
and ethnically diverse population.
The educational systems of Australia and Canada, however, are very
similar. Both countries have a comprehensive system of secondary
education wherein the great majority of students attend the same type of
secondary school, such as "high school" or "senior high
school." Common among many English-speaking countries in the OECD,
the educational philosophy of both the Australian and Canadian systems
is based predominantly on the pedagogical paradigms of progressivism and
constructivism. The states and provinces of each country have the main
control over educational funding and decision making, although Australia
has administered national standardised assessments since 2009 and is in the
process of implementing a national curriculum.
One important difference between the two educational systems is the
level of marketisation--i.e., privatisation and school choice--evident
in the two systems. Marketisation is the process by which educational
systems are organised around the market principles of choice and
competition. This typically results in the following features:
devolution of decision making, autonomy and accountability to individual
schools; increased diversity of education providers, particularly
non-governmental schools; and increased school choice (Whitty &
Power, 2000). For our purposes here, the two main features of
marketisation that differentiate the Australian and Canadian systems of
education are school choice and privatisation. Australia has a large
private education sector with about one-third of all school-aged
students and 38% of secondary students attending private schools
(Australian Bureau of Statistics, 2006). While private schooling has
historically played a large role in Australia, the proportion of
students attending private schools has been growing over the last 30
years. The growth of the private sector has been attributed to federal
funding policies that have made private schooling more attractive for
many families (Ryan & Watson, 2004). By contrast, the private sector
in Canada is much smaller, with approximately 6% of students attending a
non-government school (Phillips, Raham, & Wagner, 2004). In terms of
school choice, a much larger proportion of Canadian students attend a
school where local residence is the main criterion for admittance, in
comparison to Australia. In the international dataset that we use for
this analysis, approximately 78% of students in Canada attend a school
where local residence is the main criterion for admittance, compared to
42% in Australia. While school choice is not uncommon in Canada, it is
much less prevalent than in Australia.
Method
To answer our research questions, we conducted retrospective,
secondary analyses of the Canadian and Australian datasets from the 2006
round of the PISA. PISA is an international standardised assessment of
the literacy performance of 15-year-old students in reading, mathematics
and science developed by the OECD and administered on a 3-year cycle
beginning in 2000. Each assessment round includes all three subject
domains and also assesses one of these in greater detail; for example,
the focus area for 2006 was science. All OECD member countries
participate in PISA, as well as many non-member partner countries.
Member countries comprise the most economically developed nations,
including Australia and Canada. In the 2006 assessment round, nearly
400,000 students from 30 member countries and 27 non-member countries
participated (OECD, 2007a).
Unlike other large-scale assessments, PISA is based on
holistic characterisations of discipline-specific literacies--skills and
knowledge deemed necessary for personal and working life in
industrialised countries in a 21st century global economy (OECD, 2004).
In other words, PISA assesses students' performance in solving
everyday problems (literacies in reading, mathematics and science)
rather than achievement related to a particular curriculum.
Students' literacy scores are aggregated to allow the reporting of
national averages, and proportions of students achieving at each
proficiency level are also reported for each country.
In addition to assessing students' literacy across three
domains, PISA asks students and school principals a large number of
questions on issues potentially related to student performance. These
include student characteristics such as gender, immigrant background,
ethnicity and, most importantly for this study, a rich measure of SES
that PISA terms economic, social and cultural status (ESCS). ESCS
reflects information from students along three dimensions: parental
educational attainment, parental occupation, and cultural and financial
resources available to the student's family. This last dimension is
particularly comprehensive, including questions about the number of
books and computers in the home, whether the family owns original
artworks or a piano as well as the frequency of visiting museums and art
galleries, among others. For PISA 2006, the Australian sample had a mean
ESCS of 0.19 (standard deviation [SD] = 0.78) and ranged from a low of
-3.90 to a high of 2.54. The equivalent measures for the Canadian sample
were a mean ESCS of 0.29 (SD = 0.81) ranging from a low of -4.37 to a
high of 2.75.
In PISA, each country's sample is drawn to be statistically
representative of the total number of students enrolled in different
types of schools (e.g., private or public, college preparatory or
vocational schools, etc.) and locations (e.g., urban or rural). For
2006, the Australian sample included 356 schools and over 14,000
students; the Canadian dataset comprised 896 schools and just over
22,000 students. Participating schools use equal probability sampling to
select 35 students (the so-called "target cluster size").
However, all 15-year-old students are sampled in small schools (defined
as having between 17 and 35 15-year-old students) and very small schools
(fewer than 17 15-year-old students). Small and very small schools are
included to help ensure that country samples are demonstrably representative (OECD, 2009). The sample statistics generated from this
dataset are therefore reflective of the two populations of 15-year-old
students, and subgroups within those populations.
It should also be noted, however, that PISA employs a two-stage
sampling frame by which schools are first sampled and then students
sampled within participating schools. This approach means that sampling
weights are associated with each student in the dataset because students
and schools may not have the same probability of selection within any
given country, and some within-country strata are over-sampled to meet
national reporting priorities (OECD, 2009). Such a sampling design has
the potential to increase the standard errors of population estimates.
In the current study, therefore, all findings generated through
secondary analysis of PISA data for Canada and Australia have taken
account of the final student weights included in the datasets.
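As a minimal illustration of what applying these weights involves (a sketch, not the authors' exact procedure), the snippet below computes a weighted national mean in Python. The weight column W_FSTUWT is the final student weight supplied in PISA data files; using a single reading score column is a simplification, since PISA reports each domain as a set of plausible values and estimates standard errors with the replicate weights provided in the datasets.

```python
import numpy as np
import pandas as pd

def weighted_mean(scores: pd.Series, weights: pd.Series) -> float:
    """Population-weighted mean of a score column."""
    return float(np.average(scores, weights=weights))

# Hypothetical slice of a PISA student file: one plausible value for
# reading (PV1READ) and the final student weight (W_FSTUWT).
students = pd.DataFrame({
    "PV1READ":  [512.3, 467.8, 590.1, 430.2],
    "W_FSTUWT": [38.2,  41.7,  12.5,  55.0],
})

print(weighted_mean(students["PV1READ"], students["W_FSTUWT"]))
# A full PISA analysis would average this estimate over all plausible
# values and use the replicate weights to estimate standard errors.
```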
In this secondary analysis, we computed mean literacy performance
scores in reading and mathematics for students across various individual
and school SES backgrounds. To calculate an aggregated school SES for
each student, we averaged the ESCS scores associated with every student
who participated in PISA from a given school. We emphasise, however,
that in only a few cases (small and very small schools) did we have the
individual ESCS for every 15-year-old student in a given Canadian or
Australian school participating in PISA 2006. For the 356 schools that
comprised the 2006 Australian data, the size of the student group ranged
from a low of 3 students to a high of 58; 26 (7%) of the school groups
comprised fewer than 20 students.
Conversely, 330 schools (93%) comprised student groups of 20 or more,
with the average group being about 39 students. For Canada, the size of
the school groups ranged from a low of 1 to a high of 221 students; 254
(28%) of the school groups comprised fewer than 20 students. Conversely,
642 schools (72%) comprised student groups of 20 or more, and of these,
7 schools had groups of more than 100. The average group in the Canadian
dataset comprised about 25 students. Following the OECD's example,
we did not exclude very large or very small school groups from our
secondary analysis. First, the choice of a cut-point above or below
which to exclude seemed arbitrary; and second, the exclusion of large or
small groups did not substantially change the statistics associated with
the distribution of school group SES. For example, using the Canadian
data with all schools included, the mean school SES equalled 0.26 with a
standard error of 0.014. When very small school groups (<10 students)
and large school groups (>100 students) are excluded, mean school SES
equalled 0.29 with a standard error of 0.014.
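The aggregation and sensitivity check described above can be sketched as follows. This is an illustrative reconstruction, assuming a student-level table with a school identifier (here SCHOOLID) and the ESCS index, not the authors' code; the size cut-offs mirror the check reported in the text.

```python
import pandas as pd

def school_ses(students: pd.DataFrame) -> pd.DataFrame:
    """Mean ESCS of the PISA participants in each school, with the group size."""
    grouped = students.groupby("SCHOOLID")["ESCS"]
    return pd.DataFrame({"school_ses": grouped.mean(),
                         "n_students": grouped.size()})

def sensitivity_check(schools: pd.DataFrame) -> tuple[float, float]:
    """Mean school SES with and without very small (<10) and large (>100) groups."""
    kept = schools[schools["n_students"].between(10, 100)]
    return schools["school_ses"].mean(), kept["school_ses"].mean()
```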
The approach we used for this secondary analysis is similar to that
used to compare the effectiveness of private and public schooling across
student SES groups in the US and Chile (Lubienski & Lubienski, 2005;
Matear, 2006) and to examine the association between school SES and
performance in Australia (Perry & McConney, 2010a, 2010b).
Initially, five subgroups of students were formed for Canada and
Australia separately, based on students' individual SES; each of
these subgroups was further subdivided into five parts based on the
average SES of the school group to which they belonged. In this way, we
compared the literacy performance of high SES students across five bands
(quintiles) of schools representing low to high mean school SES. We
repeated this procedure for students with high-middle, middle,
low-middle and low individual SES backgrounds. In total, we calculated
25 such means for each literacy domain (reading and mathematics) and for
each country (Australia and Canada). As shown in Tables 1 and 2, the smallest
subgroup in our analysis comprised 93 students (low SES Australian
students attending high SES schools) and the largest group contained
1982 students (high SES Canadian students attending high SES schools).
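Under the same simplifying assumptions (one plausible value per domain, the final student weight, unweighted quintile cut-points via pandas' qcut, and a school_ses column merged back onto each student record as in the previous sketch), the 25 cell means could be tabulated along these lines; column names other than ESCS and W_FSTUWT are illustrative.

```python
import numpy as np
import pandas as pd

def cell_means(students: pd.DataFrame, score: str = "PV1READ") -> pd.DataFrame:
    """Weighted mean score for each student-SES x school-SES quintile cell."""
    df = students.copy()
    df["student_q"] = pd.qcut(df["ESCS"], 5, labels=range(1, 6))
    df["school_q"] = pd.qcut(df["school_ses"], 5, labels=range(1, 6))
    cells = df.groupby(["student_q", "school_q"], observed=True).apply(
        lambda g: np.average(g[score], weights=g["W_FSTUWT"]))
    return cells.unstack("school_q")  # rows: student SES; columns: school SES
```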
In summary, our purpose in this paper is to unpack previously
demonstrated relationships among student and school SES and
students' literacy performance to better describe and understand
how each varied in the context of variations in the other, and to
compare these variations across two similar countries, namely Canada and
Australia. In other words, our research questions in this secondary
analysis are primarily descriptive (e.g., what does the patterning of
literacy performance in reading and mathematics look like across varying
levels of individual and school SES, for Canada and Australia?). Thus,
our approach is also descriptive, by providing tabular and graphical
descriptions of how student performance varies as measured by PISA, in
the context of differing levels of individual student and school SES. We
believe that such descriptions are accessible and meaningful to
practitioners, policy makers and researchers--and hence add value to the
primary analyses already done (OECD, 2004, 2007a, 2007b). We believe
that our methods represent a powerful and broadly accessible approach to
understanding at a finer grain the interrelationships between individual
and school-level SES and their association with academic performance for
15-year-old students in Canada and Australia.
Findings
Our primary purpose in this secondary analysis is to examine the
associations between individual student SES, school group SES and
students' performance in reading and mathematics, as measured in
PISA 2006. Tables 1 and 2 show the interplay of these associations when
both student SES and school SES are disaggregated. Organised by subject,
the two tables reveal consistent patterns of improved literacy
performance associated with increases in student and school SES, for
both Canada and Australia.
For Canadian students, the reading literacy difference between low
and high SES students (1st and 5th student SES quintiles), both
attending high SES schools, is 69.9 score points (0.76 SD), on average.
Similarly, the observed reading literacy gap between low and high SES
Canadian students both attending middle SES schools is 52.3 points (0.57
SD), and the difference on average between students with low and high
SES backgrounds, both attending low SES schools, is 63.4 (0.69 SD). By
comparison, for Australian 15-year-olds, the equivalent gaps between low
and high SES students are 48.4 (0.54 SD) with both attending high SES
schools; 54.6 (0.61 SD) with both attending middle SES schools and 54.2
(0.61 SD) with both attending low SES schools, respectively.
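The standard deviation figures quoted in parentheses are simply these score-point gaps divided by a standard deviation of the relevant PISA score distribution. As a worked check (the divisor of roughly 92 points is the Canadian reading SD implied by the reported values, not a figure taken from the paper):

```python
def effect_size(gap_points: float, sd_points: float) -> float:
    """Express a PISA score-point gap in standard deviation units."""
    return gap_points / sd_points

# 69.9-point reading gap between low and high SES Canadian students in high
# SES schools, divided by an assumed reading SD of about 92 points.
print(round(effect_size(69.9, 92.0), 2))  # 0.76, matching the reported value
```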
As detailed in Table 2, the situation is much the same for
mathematics literacy in the two countries. On average, for example, the
mathematics literacy difference between low and high SES Canadian
students (1st and 5th student SES quintiles), both attending high SES
schools, is 58.3 points (0.72 SD). Similarly, the gap in
mathematics literacy between low and high SES Canadian students both
attending middle SES schools is 47.8 points (0.59 SD), and the
difference on average between Canadian students with low and high SES
backgrounds, both attending low SES schools, is 61.6 (0.76 SD). By
comparison for Australian 15-year-olds, the equivalent gaps between low
and high SES students are 36.9 (0.44 SD) with both attending high SES
schools; 50.6 (0.60 SD) with both attending middle SES schools and 54.2
(0.64 SD) with both attending low SES schools, respectively.
Most importantly for this secondary analysis, school-group SES also
plays a non-trivial role in literacy performance for both countries. In
Canada, for example, the average student with a low SES background and
attending a low SES school lags his/her typical peer attending a high
SES school by 51.0 points (0.55 SD) in reading. Similarly, the typical
student with a high SES background and attending a low SES school lags
his/her typical peer attending a high SES school by 57.5 score points
(0.71 SD). For Australian 15-year-olds, equivalent comparisons show that
for students with low SES backgrounds, the average gap in reading
literacy associated with attending low versus high SES schools is 77.0
points (0.88 SD); and, for students with high SES backgrounds, the
average gap is 71.2 (0.90 SD).
As in the case of reading, school-group SES also plays a
substantial role in mathematics literacy for both countries. In Canada
for example, as shown in Table 2, the typical student with a low SES
background and attending a low SES school lags his/her typical peer
attending a high SES school by 37.3 points (0.46 SD). Similarly, the
typical student with a high SES background and attending a low SES
school lags his/her typical peer in mathematics literacy attending a
high SES school by 33.9 points (0.44 SD). For Australian 15-year-olds,
equivalent comparisons show that for students with low SES backgrounds,
the average gap in mathematics associated with attending low versus high
SES schools is 78.7 (1.01 SD); and, for students with high SES
backgrounds, the average gap in mathematics literacy performance is 61.4
points (0.80 SD).
For the two countries, the gaps in reading literacy associated with
differences between high and low SES school groups are portrayed in
Figure 1. This comparative depiction shows that although both countries
experience differences in reading literacy associated with the SES of
the school--across the entire range of individual student SES--the gaps
evident at the two ends of the student-level SES continuum are
considerably more pronounced in Australia than in Canada. Specifically,
while low SES students in Canada experience a reading literacy gap
between low and high SES school groups of 0.55 SD, the equivalent
difference is 0.88 SD for Australia. Similarly, while high-middle SES
students in Canada experience a reading literacy gap between low and
high SES school groups of 0.46 SD, the equivalent difference is 0.76 SD
for Australia. Whether one has a low or a mid-to-high SES family
background, where one attends school (in terms of the school's
aggregated SES) matters more in Australia than it does in Canada.
As with reading literacy for the two countries, the average
differences in mathematics literacy associated with differences between
high and low SES school groups are also portrayed in Figure 1. This
portrayal shows that although both countries experience differences in
mathematics literacy associated with the SES of the school--across the
entire range of individual student SES--the differences at the ends of
the student-level SES continuum are again more evident in Australia than
in Canada. Specifically, while low SES students in Canada experience a
mathematics literacy gap between low and high SES school groups of 0.46
SD, the equivalent difference is fully 1.01 SD for Australia. Similarly,
while high SES students in Canada experience a mathematics literacy gap
between low and high SES school groups of 0.44 SD, the equivalent
difference is 0.80 SD in Australia. Similar to the case for reading, but
more dramatically in mathematics, if one has either a low or a high SES
family background, the aggregated SES of the school one attends appears
considerably more important in Australia than in Canada.
Discussion
Our secondary analysis of the relationships between school SES and
academic achievement of 15-year-old students in Canada and Australia
found the following:
(1) The relationship between school SES and academic achievement is
evident for all students in both countries; regardless of their own
individual SES, students' academic performance in reading and
mathematics improves as the SES of the school group increases;
(2) The relationship between school SES and academic achievement is
generally weaker in Canada than in Australia; in other words, where one
goes to school typically matters less in Canada than in Australia;
(3) Nevertheless, the relationship between school SES and academic
performance is strong in both countries, with differences between the
lowest and highest school SES contexts ranging between 0.3 and 1.0
standard deviation units;
(4) In Canada, increases in school SES are associated with
increases in student achievement that are relatively consistent in size;
in Australia, however, increases in achievement associated with
increased school SES are considerably more pronounced between middle and
high SES schools than between low and middle SES schools. When plotted
(see Figures 2 and 3), relationships between school SES and student
achievement in Australia look like the end of an ice hockey stick. In
Australia, high SES schools seem able to provide a much stronger
performance advantage than other schools. Compared to their counterparts
in Australia, high SES schools in Canada do not appear to possess such a
relative advantage, except in the case of mathematics performance for
high SES students.
As these findings indicate, where one goes to school (in terms of
the collective or average SES of the school) seems less important in
Canada than in Australia. Achievement gaps between low and high SES
students and schools are generally smaller in Canada than in Australia.
And attending a high SES school does not provide as much of an
educational advantage in Canada as it seems to do in Australia. Figure 2
plots the average mathematics literacy performance for the lowest and
highest SES student quintiles, across five school SES contexts for each
country, and Figure 3 reproduces this for reading. As can be seen in
both figures, the lines representing average student performance tend to
be flatter for Canada than for Australia, especially for low SES
students. Moreover, the slopes of the lines representing mathematics and
reading performance in Australia become particularly steep between
middle and high SES school contexts. This is what we refer to as the
(ice) "hockey stick" effect.
Compared to Canada, the Australian system appears to be more suited
to reproducing educational advantage than to ameliorating it. The
more equitable nature of the Canadian system, however, is not associated
with lower quality. For example, the PISA 2006 report (OECD, 2007b)
shows that both countries have essentially the same proportion of
students who achieve within the two highest proficiency bands (14.6% for
Australia and 14.4% for Canada; the OECD average is 9.0%). Moreover,
educational equity in Canada does not come at the expense of privileged
students: high SES students perform the same in both countries, while
low SES students generally perform higher in Canada than in Australia
(Perry & McConney, 2011).
[Figure 2 omitted: average mathematics literacy performance for the lowest and highest student SES quintiles, across five school SES contexts, for Canada and Australia.]
While our study does not provide direct evidence that explains
these differences between the two systems of education, we offer the
following possible explanations. First, the cross-national differences
are unlikely to be due to differences in the student cohorts. The
student-level ESCS indices are slightly higher in Canada than in
Australia, but the ESCS range is similar for the two countries. Student
variability in the distribution of PISA's ESCS is very similar and
the inter-quartile range of the distribution is practically the
same--1.12 in Australia and 1.13 in Canada, compared to the OECD average
of 1.28 (OECD, 2007b). Similarly, the range of school SES values is
comparable between the two countries (OECD, 2007b). In other words, the
overall ESCS values for students and schools are comparable between the
countries. Second, it is unlikely that the Canadian education system has
smaller achievement gaps than Australia because the former has more
effective teachers or principals. There is nothing in the research
literature that would suggest qualitative differences in the training,
quality or effectiveness of practitioners across the two countries. This
remains a possibility, however, and would certainly be worthy of
future study.
[Figure 3 omitted: average reading literacy performance for the lowest and highest student SES quintiles, across five school SES contexts, for Canada and Australia.]
Rather than qualitative differences between students or
practitioners, it is more likely that our findings are reflective of
differences in the ways in which students are sorted across schools, and
the resources that are available to students across different school
contexts. School socioeconomic segregation is much less pronounced in
Canada than in Australia. Approximately 60% of students attend a
socially mixed school in Canada, a proportion exceeded only by those of
Finland and Norway (OECD, 2010). By contrast, only 35% of students in
Australia attend a socially mixed school, one of the smallest
proportions among OECD countries. Likewise, approximately 55% of
advantaged students attend a socially advantaged school in Australia,
compared to 40% in Canada. This higher level of Australian school
segregation is accompanied by PISA analyses that show that advantaged
schools in Australia are more likely to have better educational
resources than other schools, and that this correlation is moderately
strong (0.31) and statistically significant compared to the OECD average
(OECD, 2010). This correlation is uncommon among OECD countries; indeed,
only three (Australia, Chile and Mexico) of the 34 participating OECD
countries showed such a correlation between advantaged schools and
superior resources.
Conclusion
The relationship between school SES and student outcomes is
generally stronger in Australia than in Canada. An important and visible
difference between the Australian and Canadian educational systems is
the degree to which they are marked by school choice, privatisation and
social segregation. In Australia, these features of educational
marketisation have provided unequal access to resources and
"good" schools and have led to levels of social exclusion and
segregation higher than in comparable, highly developed countries such
as Canada. Our findings build on previous theoretical and empirical
research suggesting that where one goes to school matters a great deal
in education systems that have high levels of social segregation and
differential resourcing. Our findings also suggest that such systems
foster an educational "Matthew effect" that increases rather
than decreases achievement gaps between advantaged and disadvantaged students and schools.
Declaration of conflicting interests
None declared.
Funding
This study was supported by a grant from the Australian Research
Council's Discovery Projects funding scheme (project number
1097057) awarded to Laura Perry.
DOI: 10.1177/0004944113485836
References
Alegre, M. A., & Ferrer, G. (2010). School regimes and
education equity: Some insights based on PISA 2006. British Educational
Research Journal, 36(3), 433-461.
Australian Bureau of Statistics. (2006). Australian social trends,
2006. Canberra, Australia: Author.
Blossfeld, H.-P., & Shavit, Y. (1993). Persisting barriers:
Changes in educational opportunities in thirteen countries. In Y.
Shavit, & H.-P. Blossfeld (Eds.), Persistent inequality (pp. 1-24).
Boulder, CO: Westview.
CIA. (2011, August 28). CIA world factbook 2011. Retrieved from
https://www.cia.gov/library/publications/the-world-factbook/index.html
Coleman, J., Campbell, E., Hobson, C., McPartland, J., Mood, A.,
Weinfeld, F., ... York, R. (1966). Equality of educational opportunity.
Washington, DC: US Government Printing Office.
Connor, H., & Dewson, S. (2001). Social class and higher
education: Issues affecting decisions on participation by lower social
class groups. London, UK: HMSO.
Dronkers, J., & Levels, M. (2006). Social-economic and ethnic
school-segregation in Europe and Australia and educational achievement
of migrant-pupils coming from various regions of origins. Paper
presented at the meeting of the ISA Research Committee on Social
Stratification and Mobility "Intergenerational Transmissions:
Cultural, Economic or Social Resources?" Nijmegen, Netherlands.
Kahlenberg, R. (2001). All together now: Creating middle-class
schools through public school choices. Washington, DC: Brookings
Institution.
Lee, J. B. (1999). How do students and families pay for college? In
J. E. King (Ed.), Financing a college education: How it works, how
it's changing (pp. 9-27). Phoenix, AZ: American Council on
Education.
Lubienski, S. T., & Lubienski, C. (2005). A new look at public
and private schools: Student background and mathematics achievement. Phi
Delta Kappan, 86(9), 696-699.
Matear, A. (2006). Equity in education in Chile: The tensions
between policy and practice. International Journal of Educational
Development, 27(1), 101-113.
McConney, A., & Perry, L. B. (2010). Science and mathematics
achievement in Australia: The role of school socioeconomic composition
in educational equity and effectiveness. International Journal of
Science and Mathematics Education, 8(3), 429-452.
McGaw, B. (2010). Banksia association annual lecture. Perth,
Australia: Murdoch University.
McPherson, A., & Willms, J. D. (1987). Equalisation and
improvement: Some effects of comprehensive reorganisation in Scotland.
Sociology, 21, 509-539.
Orfield, G., & Yun, J. T. (1999). Resegregation in American
schools. Cambridge, MA: The Civil Rights Project, Harvard University.
Organisation for Economic Co-operation and Development [OECD].
(2004). Learning for tomorrow's world. First results from PISA
2003. Paris, France: Author.
Organisation for Economic Co-operation and Development [OECD].
(2005). School factors related to quality and equity: Results from PISA
2000. Paris, France: Author.
Organisation for Economic Co-operation and Development [OECD].
(2007a). PISA 2006: Science competencies for tomorrow's world, vol.
1. Paris, France: Author.
Organisation for Economic Co-operation and Development [OECD].
(2007b). PISA 2006: Science competencies for tomorrow's world, vol.
2. Paris, France: Author.
Organisation for Economic Co-operation and Development [OECD].
(2009). PISA 2006: Technical report. Paris, France: Author.
Organisation for Economic Co-operation and Development [OECD].
(2010). PISA 2009 results: Overcoming social background: Equity in
learning opportunities and outcomes. Paris, France: Author.
Palardy, G. J. (2008). Differential school effects among low,
middle, and high social class composition schools: A multiple group,
multilevel latent growth curve analysis. School Effectiveness and School
Improvement, 19, 21-49.
Perie, M., Moran, R., & Lutkus, A. D. (2005). NAEP 2004 trends
in academic progress. Three decades of student performance in reading
and mathematics. Washington, DC: US Department of Education.
Perry, L. B., & McConney, A. (2010a). Does the SES of the
school matter? An examination of socioeconomic status and student
achievement using PISA 2003. Teachers College Record, 112(4), 1137-1162.
Perry, L. B., & McConney, A. (2010b). School socio-economic
composition and student outcomes in Australia: Implications for
education policy. Australian Journal of Education, 54(1), 72-85.
Perry, L. B., & McConney, A. (2011, November). Achievement gaps
by student and school socioeconomic status: A comparison of Australia
and Canada. Paper presented at the annual conference of the Australian
Association for Research in Education, Hobart, Australia.
Phillips, S., Raham, H., & Wagner, K. (2004). School choice:
Policies and effects. Kelowna, Canada: Society for the Advancement of
Excellence in Education.
Renzulli, J. S., & Park, S. (2000). Gifted dropouts: The who
and the why. Gifted Child Quarterly, 44, 261-271.
Robertson, D., & Symons, J. (2003). Do peer groups matter? Peer
group versus schooling effects on academic attainment. Economica,
70(277), 31-53.
Rumberger, R. W., & Palardy, G. J. (2005). Does segregation
still matter? The impact of student composition on academic achievement
in high school. Teachers College Record, 107(9), 1999-2045.
Ryan, C., & Watson, L. (2004). The drift to private schools in
Australia: Understanding its features. Canberra, Australia: Centre for Economic
Policy Research, Australian National University.
Sirin, S. R. (2005). Socioeconomic status and academic achievement:
A meta-analytic review of research. Review of Educational Research,
75(3), 417-453.
Southworth, S. (2010). Examining the effects of school composition
on North Carolina student achievement over time. Education Policy
Analysis Archives, 18(29), 50-90.
Sui-Chu, E. H., & Willms, J. D. (1996). Effects of parental
involvement on eighth-grade achievement. Sociology of Education, 69(2),
126-141.
Terenzini, P. T., Cabrera, A. F., & Bernal, E. M. (2001).
Swimming against the tide: The poor in higher education. New York, NY:
College Board.
Whitty, G., & Power, S. (2000). Marketization and privatization in mass education systems. International Journal of Educational
Development, 20, 93-107.
Willms, J. D. (1999). Quality and inequality in children's
literacy: The effects on families, schools, and communities. In D. P.
Keating, & C. Hertzman (Eds.), Developmental health and the wealth
of nations. Social, biological, and educational dynamics. New York, NY:
Guilford Press.
Willms, J. D. (2010). School composition and contextual effects on
student outcomes. Teachers College Record, 112(4), 1008-1037.
Zimmer, R. W., & Toma, E. F. (2000). Peer effects in private
and public schools across countries. Journal of Policy Analysis and
Management, 19(1), 75-92.
Laura B Perry
Senior Lecturer, Contexts of Education, School of Education,
Murdoch University, Australia
Andrew McConney
Senior Lecturer, Program Evaluation, Research Methods and Classroom
Assessment, School of Education, Murdoch University, Australia
Corresponding author:
Laura B Perry, School of Education, Murdoch University, 90 South
St, Murdoch WA 6150, Australia.
Email: L.Perry@murdoch.edu.au
Table 1. Reading literacy means (cell sizes in parentheses) by student SES
and school group SES for Canada and Australia as measured by PISA 2006.

Reading literacy: PISA 2006--Canada

Student SES        School group SES
(ESCS)             1st quintile     2nd quintile     3rd quintile     4th quintile     5th quintile     All quintiles
1st quintile       460.8 (n=1825)   478.3 (n=1130)   497.9 (n=807)    504.5 (n=518)    511.8 (n=158)    478.9 (n=4438)
2nd quintile       495.8 (n=1093)   498.8 (n=1148)   515.5 (n=995)    517.5 (n=763)    542.3 (n=422)    509.2 (n=4421)
3rd quintile       488.1 (n=789)    513.2 (n=985)    523.4 (n=973)    535.9 (n=1024)   552.7 (n=683)    522.3 (n=4454)
4th quintile       519.9 (n=483)    528.2 (n=733)    535.7 (n=938)    545.2 (n=1096)   559.6 (n=1149)   541.3 (n=4399)
5th quintile       524.2 (n=245)    537.5 (n=463)    550.2 (n=670)    561.3 (n=1064)   581.7 (n=1982)   564.2 (n=4424)
All quintiles      484.2 (n=4435)   505.6 (n=4459)   523.6 (n=4383)   537.4 (n=4465)   565.1 (n=4394)   523.1 (n=22136)

Reading literacy: PISA 2006--Australia

Student SES        School group SES
(ESCS)             1st quintile     2nd quintile     3rd quintile     4th quintile     5th quintile     All quintiles
1st quintile       458.3 (n=1158)   463.7 (n=792)    472.0 (n=505)    494.4 (n=252)    535.3 (n=93)     468.10 (n=2800)
2nd quintile       482.1 (n=734)    489.0 (n=697)    497.1 (n=642)    511.9 (n=492)    523.5 (n=234)    495.96 (n=2799)
3rd quintile       491.7 (n=452)    499.6 (n=609)    510.1 (n=657)    526.6 (n=678)    546.4 (n=427)    514.34 (n=2823)
4th quintile       499.4 (n=287)    503.3 (n=437)    525.8 (n=569)    536.6 (n=737)    560.9 (n=757)    531.94 (n=2787)
5th quintile       512.5 (n=151)    528.9 (n=267)    526.6 (n=414)    549.4 (n=679)    583.7 (n=1275)   557.74 (n=2786)
All quintiles      477.19 (n=2782)  490.18 (n=2802)  505.86 (n=2787)  529.24 (n=2838)  565.12 (n=2786)  513.56 (n=13995)

SES: socioeconomic status; PISA: Programme for International
Student Assessment.
Table 2. Mathematics literacy means (cell sizes in parentheses) by student SES
and school group SES for Canada and Australia as measured by PISA 2006.

Mathematics literacy: PISA 2006--Canada

Student SES        School group SES
(ESCS)             1st quintile     2nd quintile     3rd quintile     4th quintile     5th quintile     All quintiles
1st quintile       474.5 (n=1825)   490.9 (n=1130)   498.2 (n=807)    513.9 (n=518)    511.8 (n=158)    488.9 (n=4438)
2nd quintile       505.7 (n=1093)   509.5 (n=1148)   513.0 (n=995)    520.7 (n=763)    531.5 (n=422)    513.4 (n=4421)
3rd quintile       490.8 (n=789)    522.8 (n=985)    518.5 (n=973)    532.6 (n=1024)   537.6 (n=683)    520.7 (n=4454)
4th quintile       524.9 (n=483)    535.0 (n=733)    529.0 (n=938)    545.1 (n=1096)   548.4 (n=1149)   538.6 (n=4399)
5th quintile       536.1 (n=245)    546.5 (n=463)    546.0 (n=670)    560.8 (n=1064)   570.0 (n=1982)   559.8 (n=4424)
All quintiles      494.0 (n=4435)   515.8 (n=4459)   520.0 (n=4383)   538.2 (n=4465)   553.5 (n=4394)   524.3 (n=22136)

Mathematics literacy: PISA 2006--Australia

Student SES        School group SES
(ESCS)             1st quintile     2nd quintile     3rd quintile     4th quintile     5th quintile     All quintiles
1st quintile       472.5 (n=1158)   475.6 (n=792)    481.1 (n=505)    500.1 (n=252)    551.2 (n=93)     480.0 (n=2800)
2nd quintile       489.5 (n=734)    492.2 (n=697)    500.3 (n=642)    520.0 (n=492)    535.1 (n=234)    501.8 (n=2799)
3rd quintile       498.4 (n=452)    504.3 (n=609)    515.3 (n=657)    531.8 (n=678)    555.9 (n=427)    520.3 (n=2823)
4th quintile       506.9 (n=287)    510.1 (n=437)    532.2 (n=569)    539.5 (n=737)    568.4 (n=757)    537.9 (n=2787)
5th quintile       526.7 (n=151)    531.1 (n=267)    531.7 (n=414)    554.9 (n=679)    588.0 (n=1275)   562.8 (n=2786)
All quintiles      487.7 (n=2782)   496.6 (n=2802)   511.5 (n=2787)   534.5 (n=2838)   572.1 (n=2786)   520.5 (n=13995)

SES: socioeconomic status; PISA: Programme for International
Student Assessment.
Figure 1. Gaps in performance in reading and mathematics
between students in low socioeconomic status (SES) and
high SES school groups, expressed in standard deviation
units, across student SES quintiles for Australia and Canada.

                         1st Student   2nd Student   3rd Student   4th Student   5th Student
                         Quintile      Quintile      Quintile      Quintile      Quintile
Canada-Reading           0.55          0.53          0.76          0.46          0.71
Australia-Reading        0.86          0.48          0.66          0.76          0.90
Canada-Mathematics       0.46          0.34          0.61          0.30          0.44
Australia-Mathematics    1.01          0.58          0.73          0.78          0.80