
Article information

  • Title: To investigate or not to investigate? The use of content specific open-ended tasks.
  • Authors: White, Paul; Sullivan, Peter; Warren, Elizabeth
  • Journal: Australian Mathematics Teacher
  • Print ISSN: 0045-0685
  • Year of publication: 2016
  • Issue: September
  • Language: English
  • Publisher: The Australian Association of Mathematics Teachers, Inc.
  • Abstract: Some teachers are concerned that a problem solving approach to teaching may reduce attention to the key concepts and procedures of mathematics. The polarisation of positions concerning problem solving and investigations versus the notion of a secondary mathematics teacher 'as an expositor and director of learning' (Allen, 1998, p.3) is illustrated by the debate raging in the US tagged the 'Math Wars'. A similar situation has arisen in New South Wales. The Stage 5 mathematics syllabus introduced in 1997 contained a whole strand on mathematical investigations. However, due to some strong opposition claiming such investigations take students away from content focused mathematics, this section was temporarily made optional and is currently under review.
  • Keywords: Education, Secondary; Mathematics; Mathematics education; Mathematics teachers; Secondary education

To investigate or not to investigate? The use of content specific open-ended tasks.


White, Paul; Sullivan, Peter; Warren, Elizabeth


[ILLUSTRATION OMITTED]

Some teachers are concerned that a problem solving approach to teaching may reduce attention to the key concepts and procedures of mathematics. The polarisation of positions concerning problem solving and investigations versus the notion of a secondary mathematics teacher 'as an expositor and director of learning' (Allen, 1998, p.3) is illustrated by the debate raging in the US tagged the 'Math Wars'. A similar situation has arisen in New South Wales. The Stage 5 mathematics syllabus introduced in 1997 contained a whole strand on mathematical investigations. However, due to some strong opposition claiming such investigations take students away from content focused mathematics, this section was temporarily made optional and is currently under review.

An approach which appears to contain some of the desirable attributes of investigations, and which has received recent research interest, involves the use of content specific open-ended tasks (e.g. Sullivan, Warren, White, & Suwarsono, 1998) as opposed to the standard closed task. An example of a closed task is:

Find the mean of 8, 10, 12, 12 and 18.

The corresponding content specific open-ended task would be:

The mean of a set of 5 scores is 12.

What might be the scores?
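As an illustration only (not part of the original article), a short Python sketch enumerates a few of the many valid answers: any five scores summing to 5 x 12 = 60 have a mean of 12.

    # Illustrative sketch (not from the article): list a few sets of five
    # whole-number scores whose mean is 12, i.e. whose sum is 5 * 12 = 60.
    from itertools import combinations_with_replacement

    target_sum = 5 * 12  # mean of 12 over 5 scores
    answers = []
    for scores in combinations_with_replacement(range(1, 21), 5):
        if sum(scores) == target_sum:
            answers.append(scores)
            if len(answers) == 3:  # the tasks below ask for at least 3 answers
                break

    for scores in answers:
        print(scores, "mean =", sum(scores) / 5)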

Some other examples of content specific open-ended tasks are:

* A number has been rounded off to 5.6.

What might be the number?

* Write some numbers with exactly six factors.

* Write algebraic expressions which might be multiplied to give 4a².

* Find rectangles with the same area but different perimeters.
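To illustrate how such tasks admit many valid answers, here is a minimal Python sketch (our illustration, not part of the article) for the 'exactly six factors' task above:

    # Illustrative sketch (not from the article): find whole numbers with
    # exactly six factors, one of the open-ended tasks listed above.
    def factor_count(n):
        return sum(1 for d in range(1, n + 1) if n % d == 0)

    six_factor_numbers = [n for n in range(1, 100) if factor_count(n) == 6]
    print(six_factor_numbers)  # 12, 18, 20, 28, 32, 44, 45, 50, 52, ...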

The efficacy of such tasks is now considered by analysing students' responses to one pair of comparable closed and open-ended content specific tasks.

Comparing responses to open-ended and closed tasks

In some earlier trialling of open-ended tasks, many students regularly responded with only one answer even though multiple answers were available.

These single responses made comparisons between closed and open-ended tasks difficult. As a result, in the tasks that follow, a prompt for at least three answers was given for all open-ended tasks.

The following area and perimeter task was administered to just over 1000 Grade 7 students from 23 schools across New South Wales, Queensland and Victoria in both a closed and an open form. As well, a generalised form of the task was included to allow students to express mathematical ideas in their own words.

Closed version

Joan designs a garden 36 m² in area and 3 m wide.

[Diagram: a rectangle labelled A = 36 m², with one side marked 3 m]

What is its perimeter?

Open version

Joan wants a garden to be rectangular with a perimeter of 30 m. What might be the area of the garden? (Give at least 3 answers.)

Generalised version

Explain to Joan how to work out some more of these designs for herself.
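For reference, a brief worked sketch of the three versions (our illustration, not part of the administered tasks): in the closed version the missing side is 36 ÷ 3 = 12 m, so the perimeter is 2 x (3 + 12) = 30 m; in the open version, halving the 30 m perimeter gives 15 m, and any two side lengths adding to 15 m give a possible design.

    # Illustrative sketch (not from the article): worked answers for the tasks.

    # Closed version: area 36 m^2 and width 3 m.
    width = 3
    length = 36 // width              # missing dimension: 12 m
    perimeter = 2 * (width + length)  # 2 * (3 + 12) = 30 m
    print("Closed version: perimeter =", perimeter, "m")

    # Open (and generalised) version: halve the 30 m perimeter to get 15,
    # then any two whole-number side lengths adding to 15 give a valid design.
    half_perimeter = 30 // 2
    for w in range(1, half_perimeter // 2 + 1):
        length = half_perimeter - w
        print(f"{length} m x {w} m -> area {length * w} m^2")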

Results

Results for each of the three versions are presented separately. Implications for teaching and assessment are considered later.

Closed version

For the closed version, responses were scored as no attempt, incorrect or correct. Tables 1 and 2 show the breakdown and most common incorrect responses. Responses were scored as correct if they showed 30, regardless of units.

The results show that:

* there were a surprisingly large number of incorrect responses;

* the common errors (19.8% of all responses) account for only about 40% of the incorrect responses (52.5% of all responses);

* the missing dimension of 12 is readily found;

* even after being found, using the missing dimension of 12 to find the perimeter is problematic.

Open version

Each of the open-ended items was scored as no attempt or using the following codes:

i     3 correct responses
ii    1 or 2 correct responses (no errors)
iii   Some correct and some incorrect responses
iv    1 or 2 errors (no correct responses)
v     3 errors

Codes (ii) and (iv) occurred when only one or two responses were given. Tables 3 and 4 show the breakdown of responses and the most common errors.

The results show that:

* 40% (17.3 + 6.6 + 16.2) gave at least one correct response--less than half of these gave 3 correct responses and 24% (17.3 + 6.6) gave only correct answers;

* 28% gave 3 incorrect responses;

* perhaps surprisingly, 16% gave both correct and incorrect responses.

Three incorrect responses usually involved 3 responses based on a single misconception (e.g. 5 x 6, 3 x 10, 15 x 2, obtained by not halving the perimeter and instead finding two numbers with a product of 30).

Two correct responses and one incorrect response usually involved errors:

* in arithmetic (e.g. 9 x 6 = 58 cm²);

* which showed dimensions which did not add to 15 (e.g. 15 x 1);

* which repeated a rectangle and so did not give three separate answers (e.g. 10 x 5 and then 5 x 10).

One correct and two incorrect responses usually involved unrelated errors which showed no particular pattern.

The following provides a comparison of Tables 1 and 3.

* Giving 3 correct responses (17.3%) to one open-ended task as opposed to the correct response to a comparable closed version (28.6%) suggests that the open version is more cognitively demanding.

* Giving only correct responses (24%) to one open-ended task as opposed to the correct response to a comparable closed version (28.6%) suggests that correct answers were equally accessible in both versions.

* Giving some correct responses (40%) to one open-ended task as opposed to the correct response to a comparable closed version (28.6%) suggests that some form of correct response is more accessible in the open version.

The results suggest that the open version provides a wider range of information about students' understanding than the comparable closed version.

Generalised version

Table 5 provides a breakdown of responses for the generalised version.

The results show that:

* 71.6% (58.0 + 13.6) either gave no response or non-mathematical comments such as 'Let her work it out for herself' or 'I don't know and I don't care';

* 17% made an attempt but could not give a correct plan;

* the number giving a correct response is very low (even combining the 6.2% and the 5.0%).

Comparing Tables 3 and 5, we can assume that only the 17% who gave 3 correct responses to the open version would be in a position to generalise. Hence we have 3 classifications:

* Students who gave 3 correct responses but no generalisation (6%);

* Students who gave 3 correct responses and made some progress towards generalisation (5%);

* Students who gave 3 correct responses and a correct generalisation (6%).

In summary, more than 70% of these students did not actively engage in the exercise at all, while only about one third of those who gave 3 correct responses successfully stated the generalisation. Together, these results raise concern about the ability, or indeed the willingness, of students to engage in such activities.

Summary

To give some perspective to the results on closed and open-ended tasks, imagine a typical class of 25 of these students attempting such tasks.

* 10 will give some form of correct response to the open version (40%);

** 6 of these 10 will have no errors (24%);

** 4 of these 10 will give 3 correct responses (17%);

* 7 will give a correct response to the closed version (28.6%);

* 5 will make no attempt at any of the tasks;

* 1 will offer no response to the open version, but will give the correct response to the closed version;

* between 1 and 2 will write a correct generalisation.
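The class-of-25 figures above are simply the reported percentages scaled to 25 students and rounded; a minimal Python sketch of that arithmetic (our illustration, not from the article):

    # Illustrative sketch (not from the article): converting the reported
    # percentages into approximate counts for a class of 25 students.
    class_size = 25
    percentages = {
        "some correct response (open)": 40.0,
        "only correct responses (open)": 24.0,
        "3 correct responses (open)": 17.3,
        "correct response (closed)": 28.6,
    }
    for label, pct in percentages.items():
        print(f"{label}: about {round(class_size * pct / 100)} students")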

Implications

We believe that open-ended content specific tasks make a useful contribution to a mathematics curriculum in that they can:

* address conventional content explicitly and so can be easily integrated into mathematics curricula;

* have a teaching focus sufficiently similar to what teachers usually do and so are easy to implement;

* assist learning by drawing students' attention to key aspects of concepts, and make it clear to students that they can think about mathematical relationships;

* allow the possibility of the students investigating the situation for themselves and so coming to a better appreciation of the concept as a result of their own thinking;

* provide opportunities for discussion of and reflection on responses which could provide a rich, student based learning environment;

* provide assessment information on what students can do as well as what they cannot do.

With respect to assessment, open-ended tasks are useful in that:

* multiple incorrect responses provide insights into misconceptions;

* a mixture of correct and incorrect responses provides insight into what is known and not known whereas a closed task may have only shown the latter;

* the cycle of 'closed → 3 correct → generalisation' provides a hierarchy for identifying different levels of achievement.

Conclusion

Content specific open-ended tasks allow problem solving to be incorporated into the normal curriculum. An added advantage is that these tasks provide insights into content understanding as well as problem solving ability. However, we are not advocating that these be the only type of tasks used. The diet of mathematics given to students may be compared with a balanced diet. Balance is achieved not just by deciding which is best, grains, fruit and vegetables, meat and fish, or dairy, and then eating only that food type. Rather, it involves using a selection to maximise the contribution of the parts. Hence, the ideal mathematics curriculum may indeed be a balance between various approaches to teaching mathematics. These results support the use of open-ended questions as a useful teaching strategy, but they also suggest that such questions should form part of the mathematical diet and not be the sole serving.

Published in Vol. 56, No. 2, 2000, Barry Kissane (Ed.)

References

Allen, F. (1998). A program for raising the level of student achievement in secondary school mathematics. Retrieved from http://ourworld.compuserve.com/homepages/mathman/allen.htm

Sullivan, P., Warren, E., White, P. & Suwarsono, S. (1998). Different forms of mathematical questions for different purposes: Comparing student responses to similar closed and open-ended questions. In C. Kanes, M. Goos & E. Warren (Eds), Proceedings of the 21st annual conference of the Mathematics Education Research Group of Australasia (pp. 645-652). Gold Coast: MERGA.
Table 1. Percentage of all responses
to closed versions.
Table 1. Percentage of all responses
to closed versions.

Item         % responses

No attempt   18.9
Incorrect    52.5
Correct      28.6
Total        100

Table 2. Common incorrect responses
to closed versions

Common incorrect responses    % responses

12 (only found side length)   14.0
15 (forgot to double)         2.4
3 x 12 or 3 x 4               3.4
Total                         19.8

Table 3. Percentage of all responses to open version.

Item                                 % responses

No attempt                           23.0
i     3 correct                      17.3
ii    1 or 2 correct (no errors)     6.6
iii   some correct, some incorrect   16.2
iv    1 or 2 errors (no correct)     9.3
v     3 errors                       27.6
Total                                100

Table 4. Most common errors for open version.

Item                        % responses

5 x 6, 3 x 10, 15 x 2 etc   7.6
  (did not halve 30)
20 x 10, 15 x 15            4.0
  (dimensions add to 30)
Total                       11.6

Table 5. Sample percentage responses to
generalised version.

Item                                % responses

No attempt                          58.0
Incorrect plan                      8.6
Correct (e.g. halve 30, then find
  two numbers which add to 15)      6.2
Correct but incomplete              5.0
Non-mathematical                    13.6
  (e.g. use her brain)
Only gave rules for perimeter       8.6
  and area
Total                               100