Developing technical skill assessments: Amid vigorous debate over the merits of various assessment approaches and alternatives to national or industry exams, the CTE field is working hard to strengthen technical skill measurement and provide clear evidence that CTE offers unique value to students.
By Alisha Hyslop
One of the biggest challenges facing the career and technical education (CTE) community as it works to implement the 2006 Perkins Act is responding to more rigorous requirements for reporting on CTE students' technical skill attainment. The law requires that
measures be valid and reliable, and the technical skill attainment
measure is enhanced to focus on "career and technical skill
proficiencies, including student achievement on technical assessments,
that are aligned with industry-recognized standards, if available and
appropriate."
The U.S. Department of Education suggested in non-regulatory
guidance that states and local programs use the number of CTE concentrators who
passed technical skill assessments aligned with industry-recognized
standards as their performance indicator to fulfill the new
requirements. While there has been vigorous debate regarding the merits
of various assessment approaches and alternatives to national or
industry exams, 44 states have decided to use the non-regulatory
guidance at the secondary level, and 33 states have made that decision
at the postsecondary level. Even though some states have chosen a
slightly different approach, the entire field is working hard to
increase the focus on technical skill measurement in order to provide
clear evidence that CTE provides a unique value to students.
Unfortunately, very few states have such a system of technical
assessments in place. While efforts are under way at the national level
to provide some assistance, many states are moving forward with efforts
to increase their ability to accurately measure the skills students gain
in CTE programs. Some states are working to develop a set of assessments
based on their own state standards, some are looking to align with
already existing national or industry assessments, and others are taking
a combination approach. Georgia already had an assessment system in
place at the postsecondary level, but there was no established,
statewide technical skill measurement system in place for high school
students. When the new Perkins law was passed, the state embraced the
challenge to build an assessment system from scratch and began working
furiously.
Mamie Hanson, grants program consultant with the Georgia Department
of Education's Division of Career, Technical and Agricultural
Education (CTAE), said state administrators "did a lot of research
to see which approach would yield the best results for students"
and considered a variety of assessment options. The question they
ultimately decided to answer was, "Does a student have the
necessary skills to enter the career pathway or occupational area and be
successful?" It was determined that a more sustainable level of
technical skill attainment could be measured after a student had
completed a sequence of courses, which led the state toward an
end-of-pathway assessment system aligned with its new "Peach State
Pathways."
During the 2007-2008 school year, Georgia began identifying a
system of valid and reliable third-party assessments that measure
students against industry-based standards. Beginning with eight career pathways, Subject
Matter Expert (SME) Panels were established to engage in a process of
identifying or developing appropriate technical assessments. Expert
panels included four to six representatives from secondary and
postsecondary education, business and industry, and CTE administration.
Team members accomplished these tasks over four work sessions.
The first step was for the panels to identify existing assessments
corresponding to the career pathway. The panels were charged with
researching and evaluating current assessments, ensuring that exams were
valid and reliable, and using 43 criteria to determine exams'
usability.
Some of the criteria included:
* Is the assessment based on a set of industry competencies or
credentialing standards?
* What percentage of the competencies on the assessment aligns with
Georgia Performance Standards (GPS)?
* Are tests current and is there a revision schedule?
* Are there appropriate testing security procedures in place?
* Are there appropriate accommodations for special populations?
* Can the test be administered online and through paper copies?
* Can the testing organization provide accurate feedback regarding
performance for local and state reporting?
* Is the exam reasonably priced?
The final step was for each panel to review the information gathered and
decide whether to use an existing assessment in its current form or to modify it
to better align with GPS. After an extensive evaluation process, SME
panels identified eight end-of-pathway assessments for Phase I Career
Pathways. Pilot testing for this first set of pathway assessments will
be undertaken in January 2009. Exams will be offered in an online,
multiple-choice format and will typically be 90 minutes in length. The
process used for the first eight pathways began again in the fall
of 2008 with another set of 10 pathways. The formation of expert panels,
research into existing assessments, identification of appropriate
assessments, and assessment piloting will be repeated each year to ensure
complete coverage of all 54 of Georgia's career pathways within
five years.
Exams will be administered to pathway completers, defined as students
who complete three designated courses within a career pathway.
Local CTAE administrators will work with instructors to identify
eligible students. Georgia's CTAE Resource Network, a clearinghouse
that supports a variety of curriculum, assessment and professional
development activities, will assist state personnel with test
facilitation activities at the local level. The network will provide
proctor training, access to online testing procedures, and a means of
issuing and tracking certificates and licensures obtained by students.
While it is estimated that less than 1 percent of the state's
CTE students will take technical skill assessments during the 2008-2009
school year, this number will increase as additional assessments are
identified or developed. Hanson emphasized that the state hopes to offer
students national certifications in as many areas as possible to
increase the value of participation. Where those national certifications
are not available, state certifications with industry endorsements will
be developed. This will ensure that the skills students gain in
Georgia's CTE programs will be clearly recognized and valued by
employers across the state, which is one of the most important goals of
any assessment system.
See this month's Research Report on page 52 for a
comprehensive look at the progress that states are making in developing
secondary CTE standards systems.
Alisha Hyslop is ACTE's assistant director of public policy.
She can be contacted at ahyslop@acteonline.org.