
Article Information

  • Title: Survey delves into educators' use of assessment data.
  • Authors: Pritz, Sandra; Kelley, Patricia
  • Journal: Techniques
  • Print ISSN: 1527-1803
  • Year: 2009
  • Issue: November
  • Publisher: Association for Career and Technical Education

Survey delves into educators' use of assessment data.


Pritz, Sandra ; Kelley, Patricia



The term "data-driven decision making" has become ubiquitous in education, and yet it seems to be most often discussed with reference to policy decisions related to reporting requirements and accountability. What deserves at least equal attention is what would enable teachers and administrators to use student assessment results to learn about students' skills and about the effectiveness of instruction and then to apply that learning to instructional improvements. As a partner in the National Research Center for Career and Technical Education (NRCCTE) funded by the U.S. Department of Education, Office of Adult and Vocational Education (OVAE), NOCTI determined to give the issue attention by conducting a research survey.

NOCTI has a long history, spanning well over four decades, of providing comprehensive occupational skill assessments to measure student achievement. One of the organization's core beliefs is that comprehensive career and technical education (CTE) student assessment is among the most effective tools for increasing student achievement and motivation through educators' use of data for instructional purposes. This study investigated how secondary CTE educators, both teachers and administrators, use technical assessment data to improve program curricula and to identify individual and group instruction needs. Of particular interest in developing the survey were how educators have learned to make use of the data; what specific types of professional development were provided, if any; their perception of the effectiveness of that training; and what types of professional development they would consider most effective in the future.

[FIGURE 1 OMITTED]

[FIGURE 2 OMITTED]

The study design involved e-mailing the survey to a purposeful sample of CTE educators in five states (Illinois, Missouri, Oklahoma, Pennsylvania and Virginia). In collaboration with the five state directors of CTE, the survey was distributed through the directors of a random sample of secondary CTE centers and the career tech coordinators of a random sample of comprehensive high schools, and to all CTE instructors in the same four selected occupational clusters (Business Education, Construction, Health Science, and Manufacturing). Four programs were selected so that the data for those programs could be compared across sites; a sketch of this two-stage selection appears below.
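For illustration only, the two-stage selection described above could be sketched in Python as follows. The site lists, sample sizes, and the function name select_sites are hypothetical placeholders, not the study's actual procedure.

    import random

    # Hypothetical sketch of the two-stage site selection described above.
    # The site lists and sample sizes are illustrative placeholders.
    STATES = ["Illinois", "Missouri", "Oklahoma", "Pennsylvania", "Virginia"]
    CLUSTERS = ["Business Education", "Construction",
                "Health Science", "Manufacturing"]

    def select_sites(cte_centers, high_schools, n_centers=10, n_schools=10, seed=1):
        """Draw a random sample of CTE centers and comprehensive high schools per state."""
        rng = random.Random(seed)
        sample = {}
        for state in STATES:
            sample[state] = {
                "centers": rng.sample(cte_centers[state],
                                      min(n_centers, len(cte_centers[state]))),
                "high_schools": rng.sample(high_schools[state],
                                           min(n_schools, len(high_schools[state]))),
            }
        return sample

    # All CTE instructors in the four clusters at each sampled site would
    # then receive the e-mailed survey.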

The selection criteria were intended to yield a set of programs as representative as possible of CTE as a whole, and with as little overlap as possible on variables such as capital intensity, program outcome with regard to certification, student gender preponderance, cluster, and use of technology. The overriding criterion was that the programs be common enough among schools that the set would have a high probability of providing a large number of schools for the sampling frame.

In order to distance itself from the survey administration, NOCTI subcontracted that process to the Pennsylvania State University Survey Research Center (PSUSRC). The value of having the survey administered by an external, impartial organization was weighed against the time lost in working with another entity. In addition to the surveys, a number of small case-study interviews were conducted with individual schools in the five states that were known to be using assessment data to make instructional improvements. The information gleaned from these case studies will be included in the project's final report, which will be available after OVAE clearance.

Data Analyses and Findings

The data set was reviewed from a descriptive standpoint (e.g., response frequencies), the hypotheses were examined as far as the data allowed, and several potential trends related to the study hypotheses emerged. The vast majority of respondents indicated that their CTE students took end-of-program technical assessments. The results of these assessments were used for a variety of purposes, the most common of which were maintaining a continuous improvement process, making improvements to programs in areas in which scores were weak, and reporting to outside bodies.
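A minimal sketch of this kind of descriptive tabulation, assuming the survey responses have been loaded into a pandas DataFrame; the column names and values here are hypothetical, not the study's instrument:

    import pandas as pd

    # Hypothetical response records; column names and values are illustrative.
    responses = pd.DataFrame({
        "role": ["teacher", "teacher", "administrator", "teacher", "administrator"],
        "students_take_assessment": ["yes", "yes", "yes", "no", "yes"],
        "primary_use_of_results": [
            "continuous improvement", "program improvement",
            "external reporting", "continuous improvement",
            "program improvement",
        ],
    })

    # Response frequencies, the descriptive starting point described above.
    print(responses["students_take_assessment"].value_counts(normalize=True))
    print(responses["primary_use_of_results"].value_counts())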

A primary objective of this study was to investigate the extent to which CTE educators use technical assessment data to inform instructional improvement, and the sources of the knowledge that enables them to do so. A majority of respondents indicated that they do use technical assessment data to make instructional decisions. Of those who did not use the data to make improvements, most expressed a belief that they should be doing so. Figures 1 and 2 show some of the most common instructional changes educators made based on data.

When asked how they learned to use data to make instructional decisions, administrators most commonly cited teacher or administrative training, followed by being self-taught. Teachers most commonly cited being self-taught, followed by teacher training and professional development.

A second objective of this study was to examine the types of professional development CTE educators have received related to the use of data and how those offerings were perceived. Among the administrators who responded, 31.3 percent had not received such professional development; among teachers, 35.9 percent had not. Of those who had received such professional development, most felt that the training contained information they needed, at an appropriate level.

A third objective was to determine how educators rated the types of professional development they had received and what forms of professional development on the use of assessment data they felt would be most useful. The results were mixed, but they did seem to indicate that a mixture of formal training and practical follow-up would be most helpful. Respondents were also asked about areas for which they did not have professional development but wished it were available; Figure 1 shows the most frequently cited topics.
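As an illustrative sketch only, the percentages above correspond to a simple role-by-receipt cross-tabulation. The counts below are invented so that the shares roughly match the reported figures; they are not the study's data:

    import pandas as pd

    # Invented counts chosen so the shares roughly match the reported
    # percentages (5/16 ~ 31.3% of administrators and 14/39 ~ 35.9% of
    # teachers without professional development on data use).
    df = pd.DataFrame({
        "role": ["administrator"] * 16 + ["teacher"] * 39,
        "received_pd": [False] * 5 + [True] * 11 + [False] * 14 + [True] * 25,
    })

    # Row-normalized cross-tabulation: share of each role with/without PD.
    pct = pd.crosstab(df["role"], df["received_pd"], normalize="index") * 100
    print(pct.round(1))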

Respondents were also asked whom they would prefer as a delivery agent for professional development in the use of assessment data for data-based decision making. The top three choices for teachers were a knowledgeable teacher or peer; a consultant; and a representative from a professional testing organization. The top four choices for administrators were a school, district, or state data specialist; a knowledgeable teacher or peer; a consultant; and a representative from a professional testing organization.

The study posed several research hypotheses, the first of which was that educators who know more about test data interpretation will tend to use the data for instructional improvements more than those who know less. As indicated above, many respondents reported that they do use data for the purpose of making instructional improvements, although many administrators also indicated that they felt their teachers needed additional training on the use of assessment data for data-based decision making.

Figure 1: Desired Professional Development Topics

* Meaning of technical terms used on test report
* How to interpret test data
* How to measure improvement over time
* How good tests are developed
* How to select the best test for a curriculum
* Appropriate and inappropriate uses of test data
* What questions test data can and cannot answer

Overall, both teachers and administrators indicated that they had a positive perception of the value of technical skill assessments, although administrators had a slightly more positive view than did teachers. A second research hypothesis in the study was that those who use test data for program improvement perceive an impact from the changes. The descriptive findings show qualitative support for this hypothesis. Some of the types of instructional changes educators made are discussed above. When asked if they found those changes to be effective, the majority of respondents indicated that they did see them as effective. The rest of those who responded to the question indicated that they were unsure.

A third research hypothesis was that those who use test data for program improvement have had professional development on the topic. Various responses to several questions in the descriptive findings indicate that educators see the value of using data to make instructional improvements, and that professional development can be helpful in learning how to interpret and apply data most effectively.

Next Steps

In addition to the objectives and research hypotheses discussed above, the survey solicited information on a variety of other factors judged potentially helpful in addressing a fourth objective, to be completed next year: creating a professional development system geared toward improving educators' understanding of the use of technical assessment data, and increasing the use and effectiveness of such data in making instructional improvements. Findings from the survey will be combined with insights gleaned from the literature to provide a basis for professional development that can be pilot-tested and later provided in response to technical assistance requests from states.

Significance

U.S. Secretary of Education Arne Duncan, in testimony before the House Education and Labor Committee, said, "... states must build data systems that can track student performance from one year to the next, from one school to another, so that those students and their parents know when they are making progress and when they need extra attention. This information must also be put in the hands of educators so they can use it to improve instruction" (Ed.gov, 2009).

This study was especially timely because, through the 2006 reauthorization of the Carl D. Perkins Career and Technical Education Act (Perkins IV), states must for the first time report on the technical skill attainment of student concentrators in CTE programs. It is time to help educators learn from data so that they can improve student performance.

Sandra Pritz, Ph.D., is a senior consultant to NOCTI. She can be reached at nocti@nocti.org.

Patricia Kelley, Ph.D., manages the Assessment Division of NOCTI and The Whitener Group. She can be reached at nocti@nocti.org.