
Article Information

  • Title: Creating School Accountability Reports
  • Author: Richard S. Brown
  • Journal: School Administrator
  • Print ISSN: 0036-6439
  • Year: 1999
  • Issue: Nov 1999
  • Publisher: American Association of School Administrators

Creating School Accountability Reports

Richard S. Brown

A researcher offers guidance on delivering what parents most want to know about their schools

School accountability report cards can serve a variety of purposes. They can inform the public. They allow district leadership to monitor progress toward goals and identify poorly performing schools for targeted intervention. They also give individual schools a chance to highlight accomplishments that might otherwise go unnoticed.

According to Education Week's 1999 report "Quality Counts," 36 states today produce school-level report cards. Many school districts in the other 14 states have created their own accountability reports, though these do not allow for statewide comparisons.

Twenty-six of the states that require school-level report cards make them available on the Web for easy access, but only 13 states require the report cards to be sent home to parents. Even fewer people apparently see these report cards, according to the Education Week report, which surveyed educators, taxpayers and parents.

Among educators, only 51 percent had ever seen a school report card for their area. Only about 40 percent of parents indicated they had seen a school-level report card, and among taxpayers the figure was only about 25 percent. Thus, a majority of people are not seeing the report cards even in areas where they are produced. Clearly, the message is not reaching a broad audience.

What's Presented?

Primarily, the school accountability report cards focus on student achievement, usually in the form of standardized test scores. All 36 states that require school report cards publish test scores as a part of their presentation. The second most common element is dropout rates, followed by graduation rates, post-graduation plans, advanced placement coursework and other course-taking activity.

But is this what people most want to see? Not really, according to a study of school accountability report cards by Belden Russonello and Stewart and Research/Strategy/Management, two public opinion research firms.

Their survey of parents, taxpayers and educators (counselors, principals and teachers) indicated that the single most important element they wanted to see addressed on a school-level report card is school safety. Right behind was teacher qualifications. These interests persisted across all three groups of respondents.

Those issues were followed, in order, by average class size, graduation rates and dropout rates. Only then did the survey respondents express interest in student performance data. What you conclude from such findings is that test score information is important, and the public expects it to be there. But it is not the most important need in the eyes of the user.

Another interesting finding was that student demographic data, such as ethnic percentages and the share of students with limited English proficiency, was considered by all groups to be the least important of 21 pieces of information. This suggests that student demographics should be provided as context, not as an explanation, and should not be featured prominently in the report card presentation.

Comparison Data

School report cards inevitably lead to comparison. Parents and educators want to see how a particular school rates in relation to national averages, state averages, district averages or similar schools. The district's leadership also may be interested in providing a comparison of performance against individual school goals.

In addition, the school's performance over previous years is desirable and relevant. Questions such as "How did the school do on this measure last year?" are common. When presenting multiple years of data, it may be possible to indicate a trend on particular measures.

The data should be presented with context that helps readers interpret it.

The issue to wrestle with here is deciding which comparisons are relevant and appropriate. At present, most school accountability report cards publish information from previous years and allow comparisons to be made between the school's performance and the state averages. District and national averages are less common, primarily because no national tests are given to all students in all states.

Summary Scores

As you present these assorted pieces of information, you must decide at the district level whether to combine all of the data into a summary score or to issue a grade for each school.

How to combine disparate data, particularly for schools that have different types of data elements and different scores from different tests, is a tricky technical issue and should be attempted only when the summary measure can be established with sound measurement properties. In addition, how to weight the different pieces of information to arrive at a single summary score is an important question.
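To make the weighting question concrete, here is a minimal sketch in Python of one way to rescale disparate indicators to a common range and combine them with explicit weights. The indicator names, ranges and weights are illustrative assumptions, not the method of any particular state or district.

```python
# Minimal sketch: rescale each indicator to a common 0-100 range, then
# combine with explicit weights. All names, ranges and weights below are
# hypothetical -- they are not any state's or district's actual scheme.

def rescale(value, low, high):
    """Map a raw indicator value onto a common 0-100 scale."""
    return 100 * (value - low) / (high - low)

# Hypothetical indicators for one school, each on its own raw scale.
indicators = {
    "reading_percentile": {"value": 58, "low": 0,  "high": 100, "weight": 0.40},
    "attendance_rate":    {"value": 93, "low": 80, "high": 100, "weight": 0.30},
    "graduation_rate":    {"value": 88, "low": 50, "high": 100, "weight": 0.30},
}

# Weighted sum of the rescaled indicators (weights sum to 1.0).
summary_score = sum(
    ind["weight"] * rescale(ind["value"], ind["low"], ind["high"])
    for ind in indicators.values()
)
print(round(summary_score, 1))  # -> 65.5, a single 0-100 summary score
```

Even in a sketch like this, the choice of weights drives the result, which is why weighting deserves explicit deliberation rather than a default of equal weights.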

For the 1997-98 school year, a dozen states ranked all schools, usually into categorical assignments. That number will grow: education officials in California are currently devising a plan to create an Academic Performance Index score for each of the state's more than 8,000 schools.

Desirable Indicators

When developing an accountability report card, focus on those measures that are under the school's control. No single indicator, including SAT9 test scores or ITBS test scores, is sufficient to measure all things or serve all purposes. Provide demographic data as a context only, not to be used as an excuse for poor performance on those measures that are under the school's control.

At the Center for Research on Evaluation, Standards and Student Testing, we've looked at hundreds of different report card representations and have seen a variety of features. Some are as limited and primitive as a typewritten page or two of data with some explanatory language thrown in. Others, showing evidence of professional typesetting and design, offer 16 to 18 pages of data for a given school.

However, most parents and members of the public won't read a school report of a dozen or more pages. People tend to want something brief but informative, with access to additional information as needed.

Typically, school report card data is presented in tables with some accompanying explanatory text and perhaps some comparison information, such as district averages or the previous year's performance. The few graphical representations consist of bar charts or trend lines.

When images are used, they typically relate to a single indicator, failing to present graphically the multivariate nature of the school. That is, there is generally one image for one indicator, another image for another indicator, and so forth. The collection of information for a school is not presented in a holistic way.

In our examination of accountability report cards, we found that presenting the information in a more holistic way (see the related article below) could convey a great deal of information about a school in less space without losing clarity.

An appealing aspect of Connecticut's Strategic School Profile is that it allows school officials to write a fairly lengthy narrative about the school. Unfortunately, of the nearly 80 profiles we sampled covering the 1996-97 school year, only four contained any narrative at all; the rest left the section empty. The opportunity was there to tell readers things about the school they otherwise wouldn't know, but school leaders did not take advantage of it.

Key Considerations

When you decide to craft an accountability report card for your schools or district, there are several important considerations:

* Define the audience.

For whom are the report cards intended? Educators and the public do not always agree on what information is important or how it should be represented. What is the purpose of the report card? This helps to determine what you should include and how you are going to represent it. Although a school report may serve multiple purposes, no single report card can serve all purposes or constituents. It is best to avoid trying to do so.

* Be clear about the purpose.

Is the report card intended for accountability or accounting? There is a difference. If its purpose is accountability, it should be more action focused. You want to say, "Here is where this school is now and this is the action we plan to take to get to where we want to be." This differs from accounting, where you're just stating the numbers.

* Carefully select indicators.

Determining whom the report is intended to reach helps you decide what the selection of indicators should be. If the audience is parents and taxpayers, keep in mind they tend to be less interested in things like professional development at the school. If the report is designed to communicate with educators, information on how the school is meeting its goals for professional development opportunities would be relevant.

* Consider the format and presentation.

How easy is the report to interpret? Is the information easily digested yet accurately presented? The format will be governed by the method of delivery and how you choose to make the reports available once they are completed. As we've seen in our study, although several states are creating reports, people aren't looking at them. This is due partly to the fact that the reports are not made easily accessible to parents and the general public.

* Consider the availability and quality of existing data.

In our efforts to create school reports for Los Angeles-area schools, we pulled together data from seven different sources. We found that the data are not always maintained at the same level of analysis in each source. In addition, some data are not updated regularly or maintained in a way that ensures currency.

Some measures that you might like to see across the district or state may not be available, so you may want to consider setting up your data structures first and collecting your data over a period of time before starting the reporting process. You cannot have a good accountability system or school report card without good data.
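As a hedged illustration of the level-of-analysis problem, the Python sketch below aggregates hypothetical student-level records up to the school level before merging them with a school-level source; the file names and column names are invented for the example, not taken from the Los Angeles data.

```python
# Sketch of reconciling sources kept at different levels of analysis.
# File names and column names are hypothetical.
import pandas as pd

# One source holds student-level test records.
students = pd.read_csv("student_scores.csv")    # columns: school_id, reading_pct
# Another source holds school-level attendance figures.
schools = pd.read_csv("school_attendance.csv")  # columns: school_id, attendance_rate

# Aggregate the student records up to the school level, then merge on a
# common school identifier so every indicator describes the same unit.
school_scores = students.groupby("school_id", as_index=False)["reading_pct"].mean()
report_data = school_scores.merge(schools, on="school_id", how="inner")
```

Note that an inner merge like this silently drops schools missing from either source, which is one way stale or incomplete data shows up in the final report.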

* Consider the credibility of the messenger.

Interestingly, the Education Week report surveyed educators, taxpayers and parents to identify who was the most credible source of information for school report cards. Non-profit organizations were considered to be the most credible by the parents and taxpayers. The least credible source for all three groups was the local media. But educators and non-educators didn't agree on which groups were most credible. For the educators, the districts and principals rated pretty high. Parents and taxpayers had them rated a bit lower.

Richard Brown is a project director at the Center for Research on Evaluation, Standards and Student Testing at UCLA, Box 951522, Los Angeles, Calif. 90095.

A Report Card With a Dashboard Feel

In Los Angeles, at the behest of Mayor Richard Riordan, researchers at CRESST created a school report card to complement existing school accountability forms provided by the district. These reports were not intended to replace the district's efforts, but to provide information to the public in an augmented format.

Our approach was to provide a single-page, graphical representation for each school. Information on how to interpret the report comes in the form of a two-page summary preceding the report for each school.

We use a metaphor when describing these representations that is very much a part of Los Angeles life: driving an automobile. The report has the look of a dashboard, with multiple indicators represented by gauges, counters and dials. Each report is generated dynamically from a single data set compiled from a number of sources.

Gauges and Dials

Basically, the report card contains three types of objects. Bar-chart gauges display standardized test and achievement data and allow for national and districtwide comparisons. National comparisons are based on the mean percentile rank. Each bar represents a grade-level and subject-matter combination.

The color scheme is red, yellow and green to represent where the school is relative to the district average. A yellow bar indicates the measure is within the district average. If it is green, it is better than the district average. A red bar designates below-average standing.

Another type of information presented in this image is counter data--whole numbers or ratios, such as the number of suspensions or expulsions in a given school year. These are represented graphically as an odometer.

The last type of information involves indicators represented by percentages. In this case we have measures such as attendance and graduation rates. These indicators are presented using speedometer-type dials with a range of values that use the same color-coding scheme. If the needle on the dial is in the red zone, it is below the district average. If it is in the yellow zone, it is around the district average, and the green values denote above-average standing.
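For readers who want the banding logic spelled out, here is a minimal sketch in Python, assuming a simple tolerance around the comparison value; the two-point tolerance is our assumption for illustration, not a detail of the CRESST reports.

```python
# Minimal sketch of the red/yellow/green banding used for gauges and dials.
# The +/- 2-point "near average" tolerance is an illustrative assumption.

def band(school_value, comparison_value, tolerance=2.0):
    """Return the dashboard color for one indicator."""
    if school_value > comparison_value + tolerance:
        return "green"   # better than the comparison value
    if school_value < comparison_value - tolerance:
        return "red"     # below the comparison value
    return "yellow"      # within the comparison band

print(band(58, 52))  # 'green'  -- e.g., a percentile rank above the district mean
print(band(93, 94))  # 'yellow' -- e.g., an attendance rate near the district mean
```

Because the comparison value is a parameter, the same logic works whether the baseline is the district average, a school goal or the previous year's figure.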

The comparisons do not have to be against district averages. The comparison could be relative to school goals or the previous year's accomplishments.

The report card also includes a map that identifies the cluster assignment for each school in Los Angeles by geographic region. Red and green markings for special programs indicate whether a school offers after-school, before-school, evening or Saturday activities. The report also indicates the number of students who participate in these activities.

Web Viewing

This single image conveys a lot of information. When you view these images on the Web, you can flip through the different schools and actually see the colors changing and the dials moving to different levels. That way you can see how schools compare to one another.

Report Card Prototype: Short, Well-Designed; Easy to Read

The research that Richard Brown discusses in his article, "Creating School Accountability Reports," was conducted by A-Plus Communications, a consulting firm in Arlington, Va.

A-Plus asked parents and taxpayers nationwide two basic questions: What information do you want in order to hold schools in your community and state accountable for results? How do you want to receive this information?

Among its priorities, the public said it wants accountability reports to be short, well-designed and easy to read. (Parents and taxpayers also said school officials should have more detailed information available for those who want it.)

In response, A-Plus developed and tested a simple, six-panel, black-and-white prototype report card for the fictitious Jefferson Elementary School. An annotated version of the prototype, with discussion of the public's preferences by Adam Kernan-Schloss, president of A-Plus Communications, is presented below.

What We Look Like.

This section, which includes enrollment, staffing and demographic information, was rated the least useful section by the public. Many parents and taxpayers expressed particular concerns about spotlighting demographic data (such as race, primary language, poverty levels). While many educators say such information provides an important context for judging student performance, much of the public sees this context-setting as an excuse for lower performance or an inappropriate label for schools and students.

How Our Students Perform.

This section, which includes test scores along with attendance and promotion rates, was rated the most important section. Parents and taxpayers especially liked the trend data and the comparisons with district and state scores. At the same time, respondents also expressed strong concerns about relying too heavily on test scores as the primary measure of a school's performance. Indeed, when presented with 21 specific indicators of school performance, parents ranked test scores 6th--behind safety, teacher qualifications, class size, graduation rates and dropout rates. Our advice: Put test scores in a broader context that includes student graduation and promotion rates along with trend data and comparisons.

How We Spend Our Time.

This section, which includes information on course offerings, was rated the second most useful section. Parents seemed especially interested in knowing about the breadth of a school's curriculum, including special programs for bilingual, special education and gifted/talented students.

How We Spend Our Money.

This section, which includes information on class size, technology and professional staff, was rated the third most useful section. Parents and taxpayers seemed especially interested in knowing about the professional qualifications of teachers.

Our School's Environment.

This section, which focuses on safety, discipline and parent involvement, was rated the fourth most useful section. In a separate ranking of 21 specific indicators of school performance, parents and taxpayers ranked school safety as their top priority. The prominence of safety and discipline shouldn't be surprising, as the annual Phi Delta Kappa/Gallup poll finds the same concerns year after year. Because only 17 states include safety information on their report cards, this might be a particularly good area for local districts to supplement the state's incomplete information with their own.

What We Are Doing to Improve.

This section, which includes a narrative discussion of the school's accomplishments and special features, was rated the fifth most useful section. In small-group discussions, however, parents and taxpayers said this kind of "soft data" is a valuable supplement to the other information in the prototype report card.

The Bottom Line.

As we suggested in our research report, "Poll after poll shows that improving education is the public's top priority. Accountability reports that document these improvements provide education leaders with a magic moment to communicate with their community. The challenge is to take advantage of the moment."

A 16-page summary of the research ("Reporting Results: What the Public Wants to Know"), the full research report and a short video on the project are available from A-Plus Communications, 2200 Clarendon Blvd., Suite 1102, Arlington, Va. 22201, 703-524-7325. E-mail: adam@ksa.group.com; www.apluscommunications.com.

COPYRIGHT 1999 American Association of School Administrators
COPYRIGHT 2004 Gale Group
