Assessing student learning in community college honors programs using the CCCSE course feedback form.
Ross, Laura O. ; Roman, Marcia A.
INTRODUCTION
Academically talented students with impressive placement scores are
enrolling at community colleges in increasing numbers. The economy has
certainly played a role in this migration to two-year institutions,
where students can commute from home and pay lower tuition rates, but
other factors have also contributed to the change. Community colleges
have expanded their mission to meet the academic needs of this
population (Marklein; Boulard), and articulation agreements between
community colleges and universities have improved over the years (Kane).
More two-year institutions are offering honors programs for the
academically gifted students who will eventually transfer to four-year
universities (Beck). The benefits to community colleges of developing
and sustaining honors programs are many; according to Bulakowski and
Townsend, they include: (a) greater learning potential for academically
strong students; (b) higher retention of well-prepared students; (c)
higher transfer rates for honors students; (d) enhancement of the
institution's public image; and (e) increased respect from
four-year institutions (Beck; Bulakowski and Townsend; Boulard).
However, not all community college administrators and faculty
approve of honors programs in the community college setting. Opponents
claim honors programs are elitist, diverting resources and the best
professors to the academically gifted students. They argue that
community colleges--known for open and equal-access education--should be
identifying methods and resources to help all students learn better, not
just a few (Boulard; Evelyn; Outcalt; Selingo). While these arguments
may never disappear, they grow louder and often more persuasive as
budgetary pressures intensify.
COMMUNITY COLLEGE HONORS PROGRAMS AND ACCOUNTABILITY
Enrollments have increased at community colleges during the
economic downturn. Unfortunately, this increase has occurred at the same
time that states such as Florida, New Mexico, Rhode Island, and many
others have reduced their financial support for higher education; even
though their enrollments are up, community colleges have been forced to
cut expenses and eliminate programs (Bushong). Now more than ever it is
important to have valid and concrete methods of assessment for honors
programs (Lanier).
In Assessing and Evaluating Honors Programs and Honors Colleges: A
Practical Handbook, Otero and Spurrier state, "Evaluation and
assessment provide an opportunity for Honors Programs and Honors
Colleges to demonstrate their strengths, address their weaknesses,
generate institutional support, and gain outside validation of their
accomplishments and goals" (p. 5). They suggest a two-phase
evaluation process: a self-study and then an external study by a team of
NCHC-recommended Site Visitors. In the self-study report, Otero and
Spurrier recommend that the honors program or honors college develop
goals and objectives, gather evidence of accomplishing those objectives,
and identify strategies for improvement. For many programs, the
gathering of evidence is a precarious part of the self-study. Whipple
encouraged well-conducted self-assessment of programs but cautioned,
"Assessment, poorly planned and executed, wastes time and money,
and may misinform, leading to faulty conclusions" (p. 41).
The Art and Phyllis Grindle Honors Institute at Seminole Community
College (SCC) in Florida has more than doubled in size over the last
four years. The program has enhanced its curriculum, expanded to two
campuses, hosted the Florida Collegiate Honors Council Conference, and
had four consecutive Jack Kent Cooke Scholars and one All-USA Community
College Academic Team Member. Despite its impressive record, the SCC
Honors Program is scrambling, along with every other worthy program, to
develop measurable student-learning outcomes, gather evidence, and
assess student learning for accreditation self-study requirements and
for its administration. The program has written goals, objectives, and
methods of assessment in place, but had to search for a valid and
relevant assessment tool to understand how students are learning in the
honors classes compared to traditional classes. By knowing this
information, the program could better document evidence of student
learning, determine curricular or pedagogical changes, improve or
maintain strong retention rates, and perhaps justify the budget
resources directed to honors.
For several reasons, the SCC Honors Institute decided to adopt the
Center for Community College Student Engagement (CCCSE) Course Feedback
Form as an assessment tool. First, the CCCSE Course Feedback Form was
cost-effective (free) and could be downloaded from the CCCSE website.
Second, our college already recognized the CCCSE Community College
Student Report (CCSR) as a valid instrument and used it as an assessment
tool, and the CCCSE Course Feedback Form was based on questions from the
CCCSE Community College Student Report (CCSR). Finally, the questions on
the CCCSE Course Feedback Form solicited responses from students about
their learning experiences and engagement in the classroom.
STUDENT ENGAGEMENT
Research has shown that the more actively engaged students
are--with faculty, staff, other students, and the subject matter--the
more likely they are to learn and to achieve their academic goals (CCSSE Institutional Report, 2004; Astin; Pace, as noted in Kuh; Pascarella and
Terenzini).
The Center for Community College Student Engagement (CCCSE) was
launched in 2001 under the name of the Community College Survey of
Student Engagement (CCSSE) as a project of the Community College
Leadership Program based at The University of Texas at Austin. Grants
from The Pew Charitable Trusts, the Lumina Foundation for Education, the
MetLife Foundation, and Houston Endowment supported the effort. The
purpose was to stimulate dialogue about how quality is defined and
measured, to provide an appropriate assessment tool, and to raise public
awareness about the work of community colleges.
Considered the "daughter" of the National Survey of
Student Engagement (NSSE), which is used by four-year institutions to
obtain information about learning practices and student engagement, the
CCSSE addresses the unique mission and student characteristics of
community colleges (Ouimet, p. 8). The purpose of the instruments is to
provide information about effective educational practices and promote
practices demonstrated to improve student learning and retention
(McClenney, p. 138).
The CCCSE and NSSE survey instruments are based on the work of many
researchers, including Pace's seminal 1984 work on student effort,
Astin's work (1984, 1993, 1999) on student involvement, and
Chickering and Gamson's 1987 landmark publication on good practices
of undergraduate education (Kuh, p. 2). The seven principles of good
practice were developed by a task force of scholars of policy,
organizational, and economic issues in higher education as well as
others who had conducted research on the college experience (Chickering
and Gamson, 1999, p. 76). The principles or "engagement
indicators" (Kuh, p.1) include: encouraging student-faculty
contact; reciprocity and cooperation among students; active learning;
prompt feedback; time on task; communication of high expectations; and
respect for diverse talents and ways of knowing (Chickering and Gamson,
1987, p. 3).
The CCCSE survey instrument, the Community College Student Report
(CCSR), is a research-based tool that can be useful for benchmarking
performance and monitoring progress of improvement efforts by comparing
results not only to other institutions but within an institution from
one administration to another (Ouimet, p. 8). CCCSE cautions
institutions in their use of data and advises that comparison for
purposes of ranking is inappropriate.
While CCSR results provide institutional assessment data that can
be disaggregated by demographic factors such as ethnic groups,
first-generation college students, and developmental or
college-preparatory students, CCCSE's Course Feedback Form provides
a vehicle for individual course and program-level assessment. The Course
Feedback Form was developed in response to requests from community
colleges with the assistance of a CCSSE advisory group and is closely
aligned with the CCSR (McClenney, pp. 140-41). The Course Feedback Form
is password protected and available free of charge to any former or
current CCCSE-member college in the Toolkit found under Resources on
CCSSE's website at <http://ccsse.org>. The University of
Alabama has collaborated with NSSE to develop a classroom-level
adaptation of their survey instrument, called the Classroom Survey of
Student Engagement (CLASSE) for use by four-year institutions. It also
is available free of charge to past and current participants of the
NSSE. Information is available at
<http://assessment.ua.edu/CLASSE/Overview>.
STATEMENT OF THE PROBLEM
A key question is what assessment resources are available to
improve curricular programs, including honors programs, that strive to
improve student learning. A growing body of research shows that student
engagement is related to improved student learning and persistence. An
exploratory study conducted by Long and Lange demonstrated statistically
significant differences between honors and non-honors students in
academic focus, student interaction, and student activity. But while
these students may already be more engaged and exhibit higher retention
rates than non-honors students (Long and Lange), the question remains
how to assess and improve the educational practices of these students
and honors programs.
Anchored in research, and with our institution already examining
the CCCSE data in order to make improvements in student learning and
retention, the SCC Honors Program believed that the CCCSE survey and the
CCCSE Course Feedback Form could be used to specifically target
assessment and improvement of honors classes. Although the SCC Physical
Therapy Assistant program used the CCCSE Course Feedback Form in its
self-study in preparation for re-accreditation by the Commission on
Accreditation in Physical Therapy Education, it has not been widely
adopted across the College.
RESEARCH QUESTIONS
The study was guided by two research questions:
1. How do SCC honors students' responses on the CCCSE Course
Feedback Form compare to the general SCC college-credit-student
population's responses to the institution-level Community College
Student Report?
2. Based on aggregated student responses to the CCCSE Course
Feedback Form, what areas might the honors program consider addressing
to improve student engagement and therefore the student learning and
retention of its honors students?
DEFINITION OF TERMS
* CCSR is the Community College Student Report, which is the survey
instrument used by CCCSE for institutional assessment.
* CCCSE is the Center for Community College Student Engagement. It
was launched in 2001 under the name Community College Survey of Student
Engagement (CCSSE).
* CCSSE is the Community College Survey of Student Engagement and
was launched in 2001 as a project of the Community College Leadership
Program at The University of Texas at Austin. The name was officially
changed to the Center for Community College Student Engagement (CCCSE)
in spring 2009.
* CCCSE's Course Feedback Form is an end-of-course evaluation
instrument developed with the assistance of an advisory panel to provide
a tool for course-level and program-level assessment. The instrument
shares thirty-nine questions with the Community College Student Report
and contains additional questions that pertain to the specific course.
It is intended for local administration and analysis (Retrieved 6/7/08
from <http://www.ccsse.org/publications/toolkit.cfm>).
* Engagement is the quality of effort students devote to
"educationally purposeful activities that contribute directly to
desired outcomes" (Hu and Kuh, p. 555).
DESIGN AND METHODOLOGY
DESIGN OF THE STUDY
The CCCSE survey, the CCSR, was administered according to survey
protocols in spring 2007. The surveys were sent to CCSSE for data
compilation, and Seminole Community College received its results by fall
2007. CCCSE provides participating colleges with an extensive dataset of
their institution's results, including the mean scores of student
responses to each survey item.
The Course Feedback Form was administered in all honors courses at
Seminole Community College in summer 2007, fall 2007, and spring 2008,
totaling seventeen sections. The honors courses cut across various
disciplines including composition and literature, economics, psychology,
sociology, speech, humanities, history, and biology. The college's
Institutional Research Office compiled the data and provided mean scores
of student responses to each survey item for each honors course as well
as an overall mean score of all honors courses for each survey item.
The authors developed a cross-walk between the CCSR and the Course
Feedback Form in order to identify the survey items that were the same
and those that were unique to the Course Feedback Form. Thirty-nine
survey items were found to be the same, including five questions
pertaining to College Experience and Demographics. These five questions
were not examined in this study, so the study consisted of examining
mean scores from thirty-four of the survey items.
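The cross-walk described above can be sketched in code. The item-code pairs below follow the Appendix (e.g., CCSR item 4a corresponds to Course Feedback Form item 1a); the mapping shown is a small illustrative subset, and the helper function is hypothetical, not part of the authors' actual procedure.

```python
# Illustrative subset of the CCSR-to-Course Feedback Form cross-walk;
# item codes follow the Appendix tables.
crosswalk = {
    "4a": "1a",   # Asked questions in class or contributed to discussion
    "4b": "1b",   # Made a class presentation
    "4c": "1c",   # Prepared two or more drafts of an assignment
    "4u": "1o",   # Skipped class
    "5a": "2a",   # Memorizing facts, ideas, or methods
    "12n": "3l",  # Developing clearer career goals
}

def shared_items(ccsr_items, form_items, mapping):
    """Return (ccsr_code, form_code) pairs present on both instruments."""
    return [(c, f) for c, f in mapping.items()
            if c in ccsr_items and f in form_items]

# Hypothetical item inventories for each instrument.
pairs = shared_items({"4a", "4b", "5a", "9z"}, {"1a", "1b", "2a"}, crosswalk)
```

In the study itself, this matching yielded thirty-nine shared items, thirty-four of which were analyzed.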
In order to establish whether the honors students were similar as a
group across semesters, the mean scores (by semester) of each item on
the Course Feedback Form were examined to determine if there were
statistically significant differences in student responses to each item
between terms. Few or no statistically significant differences between
terms on the thirty-four items examined would imply that honors students
across all terms were similar and would support the plan to examine all
honors students' responses to the Course Feedback Form in this time
period as a group.
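The per-item check described above can be sketched as a one-way ANOVA computed by hand. The response data below are hypothetical (the study's actual per-item results appear in the Appendix), and the helper function is illustrative rather than the authors' analysis code.

```python
# One-way ANOVA per survey item across terms (hypothetical 4-point-scale
# responses; not the study's data).
def one_way_f(groups):
    """Return the omnibus F statistic for a list of response groups."""
    all_scores = [x for g in groups for x in g]
    grand_mean = sum(all_scores) / len(all_scores)
    k, n = len(groups), len(all_scores)
    # Between-groups and within-groups sums of squares.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

summer = [3, 4, 3, 2, 4, 3]
fall   = [3, 3, 4, 3, 2, 4]
spring = [4, 3, 3, 4, 3, 3]

f_stat = one_way_f([summer, fall, spring])
# With 2 and 15 degrees of freedom, the .05 critical value is roughly 3.68;
# an F statistic below that would suggest the terms are similar enough
# on this item to pool the semesters.
similar_across_terms = f_stat < 3.68
```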
The mean scores of students' responses to each item for each
of the honors classes, as well as the overall honors mean score on each
item, were compared to the mean scores of student responses to the
institutionally administered CCSR. Although the student population in
honors courses differs from the population of students who responded to
the CCSR, the comparison seemed a valid, if cursory, way to determine
whether the data did, in fact, show honors students to be more engaged
in honors classes than students in other
courses. The mean scores of the individual items on the Course Feedback
Form used in the honors courses were also compared to the overall mean
score for all honors courses as part of the honors program assessment.
RELIABILITY AND VALIDITY
CCCSE's instrument, the Community College Student Report, has
its genesis in NSSE's instrument, the College Student Report, and
shares a number of common survey questions. The score reliability and
validity of the NSSE have been extensively explored and demonstrated
(Kuh, 2002, as noted in Marti, 2004, p. 1).
The score reliability of the CCSR and its component benchmarks were
measured through use of Cronbach's alpha (Marti, 2004, p. 14).
Cronbach's alpha values for the five survey benchmarks are generally
strong, although not all exceed the "gold standard of .70" (Marti,
2009, p. 11).
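Cronbach's alpha, the internal-consistency measure Marti applied to the CCSR benchmarks, can be computed from the standard formula. The implementation below is a minimal sketch with hypothetical item scores, not the study's data or code.

```python
# Minimal Cronbach's alpha: k/(k-1) * (1 - sum of item variances /
# variance of total scores).
def cronbach_alpha(items):
    """items: one list of respondent scores per survey item."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three items answered identically by every respondent give perfect
# internal consistency (alpha = 1.0).
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```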
Test-retest reliability was evaluated by comparing students'
responses to the survey administered in more than one of their classes,
although only one survey from each individual was included in overall
analyses (Marti, 2009, p. 11).
Year-to-year comparisons between 2003, 2004, and 2005 indicate
that the instrument is measuring the same constructs across time and
that differences between subgroups are due to real differences in means,
variances, and co-variances as opposed to problems associated with the
instrument (Marti, 2009, p. 14). A major validation research study of
CCCSE's survey was recently completed that demonstrated a
relationship between student responses to survey items and student
outcomes (McClenney, p. 140).
Nearly seventy percent of the survey items on the CCCSE Course
Feedback Form are the same as items on the CCCSE Community College
Student Report. The reliability and validity of NSSE and CCCSE
institutional surveys lend credence to the reliability and validity of
the CCCSE Course Feedback Form.
SIGNIFICANCE OF THE STUDY
The CCCSE Course Feedback Form provides a means of research-based
course-level and program-level assessment. By collecting data through
CCCSE's Course Feedback Form across all honors classes in summer
2007, fall 2007 and spring 2008, the authors were able to examine not
only course- but also program-level data for the honors courses. The
CCCSE Course Feedback Form offers a research-based means of assessing
both individual classes and entire programs, providing a basis for
continued improvement and gains in student learning.
Using CCCSE's CCSR and the Course Feedback Forms together, an
institution can assess student engagement and thereby student learning
at the institutional level as well as by individual course or program.
RESULTS
Frequencies were conducted on both the Community College Student
Report and the CCCSE Course Feedback Form. The CCSR had n=829 with 447
female and 294 male responses. There were 72 Black students, 127
Hispanic students, 461 White students, and 73 students who reported
other races and ethnicities. Course Feedback Form frequencies indicated
260 responses from honors students across the 17 sections surveyed. For
the students who included demographic information, 161 were female, 80
male, 15 Black, 22 Hispanic, 112 White, and 71 students who reported
other races and ethnicities. The "other" category was large
for the Course Feedback Form because the summer session forms failed to
include a category for Hispanics. All survey results were included in
the analysis of Course Feedback Forms and no attempt was made to use
only one survey per student across all 17 sections.
In order to establish whether the honors students were similar as a
group across semesters, the mean scores (by semester) of each item on
the Course Feedback Form were examined through an analysis of variance conducted by the authors to determine if there were statistically
significant differences in student responses to each item between terms
(see Appendix). Statistically significant differences were found through
the omnibus F-test in 9 of the 34 survey items, or 26% of the items
examined. Multiple Comparison Procedures indicated that there were more
differences between the responses of honors students from summer 2007 to
fall 2007 than there were between other groups examined. Despite these
differences in their initial examination of CCCSE Course Feedback Form
data, the authors chose to examine the honors student responses from all
three semesters as a group in the comparison with institutional CCSR
results. The authors also included all Course Feedback Form responses.
Since a number of honors students were enrolled in several different
honors classes in which the survey was administered, more than one
Course Feedback Form per student is included in the results. The CCSR survey
protocol requires that only one survey per student is included in
institutional results.
Although the student population in honors courses is different from
the population of students who responded to the CCSR, we could make a
rough determination if the data showed honors students to be more
engaged in honors courses than non-honors students in other classes. The
results would also show if and how honors students were less engaged
than the larger student population at the college. Such information
could serve as the basis for improvement of the honors program.
One of the research questions guiding this study was: How do SCC
honors students' responses on the CCCSE Course Feedback Form
compare to the general SCC college-credit-student population's
responses to the institution-level Community College Student Report?
To answer this question, the researchers examined 34 questions on
the CCCSE Course Feedback Form that are the same as questions on the
institution-level Community College Student Report. The mean scores of
SCC honors students' responses to the CCCSE Course Feedback Form
(n=260) as compared with SCC students' responses to the CCCSE
Community College Student Report (n=829) indicated more engagement with
faculty, students, and learning activities on 29 of 34 identical
questions from the CCCSE Course Feedback Form and the Community College
Student Report. The survey items are categorized by CCCSE into three
groups, which are presented in Tables 1, 2, and 3.
In order to answer the second research question, the researchers
more closely examined the survey items in which honors students'
responses to questions on the CCCSE Course Feedback Form indicated less
engagement than other students' responses to the same question on
the CCSR. Five items indicated less engagement by honors students
surveyed in the honors courses (see Table 4).
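The comparison behind Table 4 amounts to flagging shared items where the honors mean indicates less engagement than the institutional mean. The sketch below uses illustrative means, not the study's actual values, and assumes higher scores indicate more engagement (reverse-scored items such as 1o would need special handling).

```python
# Illustrative means for a few shared items (not the study's values).
honors_means = {"1k": 1.8, "3l": 2.1, "1a": 3.2}
ccsr_means   = {"1k": 2.2, "3l": 2.5, "1a": 2.9}

def less_engaged(honors, institution):
    """Items where honors students report lower engagement, assuming
    higher scores mean more engagement on every item passed in."""
    return sorted(item for item, m in honors.items()
                  if m < institution[item])

flagged = less_engaged(honors_means, ccsr_means)
```

In the study, five such items emerged; the two career-related items, 1k and 3l, were chosen as the initial focus for program improvement.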
CONCLUSIONS AND DISCUSSION
Despite the limitations of this study, listed below, the CCCSE
Course Feedback Form, used in conjunction with the CCCSE Community
College Student Report, seems a promising tool for assessing courses and
programs, given its ability to measure learning gains made after
curriculum adjustments based on assessment data. Limitations include:
1. CCCSE's survey, the Community College Student Report, asked
students to consider their experience over an entire academic year and
across all of their classes while the Course Feedback Form requested
feedback on a specific course within a given term.
2. Only one survey per student is used in analyses of the Community
College Student Report while all student responses to the CCCSE Course
Feedback Form administered to different honors classes, including those
by the same student in different classes, were used in the analyses of
the Course Feedback Form.
3. Data from CCCSE's Community College Student Report and the
CCCSE Course Feedback Form are self-reported.
4. Honors students applied and were selected for admission into the
honors program while students who responded to the institution-level
Community College Student Report are subject to open admissions policies
and not selected according to academic performance.
5. Honors students are required to take a one-credit orientation
course that is a modified type of First-Year Experience while the
general college population does not take such a course.
6. Data analyses of the CCCSE Course Feedback Forms were conducted
by the College's Institutional Research Office and the authors.
National CCCSE data show that honors students are already reaping
some of the greatest benefits of what community colleges have to offer
and are highly engaged (Arnspargar, Slide 25). Honors students'
responses to 29 of 34 questions on the CCCSE Course Feedback Form
administered in this study indicated that they were more engaged in
honors courses than nonhonors students in general courses. Honors
students responded that they asked more questions in class, prepared
more drafts of papers, worked harder than they thought they could, and
discussed ideas from the class with others outside of class. Honors
students also indicated through the Course Feedback Form that their
honors courses emphasized critical thinking skills, such as analysis,
synthesis, argumentation, and problem solving, much more than the
traditional courses. This evidence from the Course Feedback Form will
help the SCC Honors Program to document the high-level learning that is
occurring in honors classrooms. The data will also help to assess the
program's new student learning outcomes, which coordinate with the
Course Feedback Form's critical thinking questions 2b, 2c, 2d, and
2e (see Table 2).
The mean scores of honors students' responses on two items are
noteworthy because a lower mean indicates a higher level of engagement.
A lower mean score of honors students' responses to item 1o on the
Course Feedback Form, which pertains to skipping class, indicates a
higher level of engagement. Likewise, a lower mean score of honors
students' responses to item 2a on the Course Feedback Form, which
pertains to mental activities involving critical thinking, compared to
all student responses to item 5a on the CCSR, indicates a higher level
of engagement. Responses are based on a 4-point scale (see Tables 1, 2
and 3).
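Because items 1o and 2a are effectively reverse-scored, direction must be normalized before aggregating or comparing means. The sketch below assumes the text's 4-point scale runs from 1 to 4; the function and sample values are illustrative, not the authors' procedure.

```python
# Items where a LOWER mean indicates MORE engagement (from the text:
# 1o, skipping class, and 2a, rote memorization).
REVERSED = {"1o", "2a"}

def engagement_score(item, mean, scale_min=1, scale_max=4):
    """Re-orient a mean so that higher always indicates more engagement.
    On a 1-4 scale, a reversed item's mean x becomes 5 - x."""
    if item in REVERSED:
        return (scale_max + scale_min) - mean
    return mean

oriented = engagement_score("1o", 1.6)  # reversed: 5 - 1.6
regular  = engagement_score("1a", 3.2)  # unchanged
```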
However, there were five responses on the Course Feedback Form
indicating less engagement for SCC honors students. The authors chose to
focus initially on two of the five items for program improvement. These
were items 1k and 3l on the Course Feedback Form, which pertain to
career plans and goals (see Table 4). These items were given priority
because
of the important association of career decision and persistence
(Sandler, p. 564).
Honors students reported on item 1k of the Course Feedback Form
that they discussed career plans with instructors less frequently than
the general population reported in the CCSR. While the general
population responding to the CCSR included students in career and
technical programs, such as nursing, criminal justice, and computer
technology, the honors students' responses were still of concern.
A second item indicating less engagement by honors students
according to the CCCSE Course Feedback Form pertained to whether the
course helped students develop clearer career goals. Honors students
indicated that honors classes were not helping them develop clear career
goals to the extent that the general population experienced in their
classes. That there were two items pertaining to career plans and goals
in which honors students seem less engaged was noteworthy. Because
honors courses often provide opportunities for exploration, careers may
be an area where SCC honors could work to improve engagement. While
professors certainly are not the only source of career information, they
may find ways to connect their subjects with various disciplines or
careers. Students can also be encouraged, perhaps in the honors
orientation class, to seek advice from faculty about academic paths and
career choices. The Program is also considering a series of one-hour
seminars for honors students to spend time with guest speakers from
different professional careers.
While the SCC Honors Program will continue to examine a variety of
data on honors students and courses, including demographics, course
completion rates, grade point averages, retention rates, and graduation rates, the Course Feedback Form is helpful because it addresses the
learning occurring in the classroom. In addition, the Course Feedback
Form provides a way to compare the honors students' responses with
the responses of the general population in the college. The data can
also be benchmarked with aggregated data from the state and national
levels.
Of course, the data from the Course Feedback Form are
student-reported; therefore, the Course Feedback Form results need to be
examined in conjunction with the many assessment methods used by faculty
and the data collected by the SCC Institutional Research Department.
These assessment methods assist in determining student engagement and
student success, but they also document the added value of an honors
program and provide justification for the budget resources allocated to
honors, which is especially important during a difficult economic time.
APPENDIX
MEAN SCORES TO DETERMINE IF HONORS STUDENTS IN SUMMER,
FALL, AND SPRING TERMS WERE SIMILAR AS A GROUP
ANOVA results of CCCSE Course Feedback Form mean scores
between Summer, Fall and Spring results for Honors students.
CCSR 2005-2007   Course Feedback Form   Question   F Test

College Activities: Academic, Intellectual and Social Experiences

1. In your experiences with this class during the current semester,
how often did you do the following?

4a   1a   Asked questions in class or contributed to class discussion   .310 *
4b   1b   Made a class presentation   .000
4c   1c   Prepared two or more drafts of an assignment before turning it in   .002
4d   1d   Worked on papers that require integrating ideas or information. . . .   .003
4f   1e   Worked with other students on projects during class   .002
4g   1f   Worked with classmates outside of class to complete the assignment   .004
4i   1g   Participated in a community-based project as part of your coursework   .020
4j   1h   Used the internet to work on an assignment   .000
4k   1i   Used e-mail to communicate with your instructor   .000
4l   1j   Discussed grades or assignments with your instructor   .200 *
4m   1k   Talked about career plans with your instructor   .339 *
4n   1l   Discussed ideas from your readings or class with your instructor outside of class   .289 *
4p   1m   Worked harder than you thought you could to meet the instructor's standards or expectations   .120 *

* Indicates statistically significant with p > .05
CCSR 2005-2007   Course Feedback Form   Question   F Test

College Activities: Academic, Intellectual and Social Experiences

1. In your experiences with this class during the current semester,
how often did you do the following?

4r   1n   Discussed ideas from the readings or class with others outside of class (students, family members, co-workers, etc.)   .010
4u   1o   Skipped class   .016
4o   1p   Received prompt feedback from your instructor about your performance   .267 *

* Indicates statistically significant with p > .05
CCSR 2005-2007   Course Feedback Form   Question   F Test

Character of Mental Activities

2. During this current semester, how much has this course emphasized
the following?

5a   2a   Memorizing facts, ideas, or methods from your courses and reading so that you can repeat them in pretty much the same form   .000
5c   2c   Synthesizing and organizing ideas, information, or experiences in new ways   .013
5d   2d   Making judgments about the value or soundness of information, arguments, or methods   .000
5e   2e   Applying theories or concepts to practical problems or in new situations   .003
5f   2f   Using information you have read or heard to perform a new skill   .008

* Indicates statistically significant with p > .05
CCSR 2005-2007   Course Feedback Form   Question   F Test

Educational and Personal Growth: Knowledge, Skills and Personal Development

3. During this current semester, to what extent did this course help
you develop in the following areas?

12c   3a   Writing clearly and effectively   .153 *
12d   3b   Speaking clearly and effectively   .002
12e   3c   Thinking critically and analytically   .000
12f   3d   Solving numerical problems   .047
12g   3e   Using computing and information technology   .557
12h   3f   Working effectively with others   .043
12i   3g   Learning effectively on my own   .048
12j   3h   Understanding myself   .008
12k   3i   Understanding people of other racial and ethnic backgrounds   .000
12l   3j   Developing a personal code of values and ethics   .045
12m   3k   Contributing to the welfare of the community   .589 *
12n   3l   Developing clearer career goals   .555 *

* Indicates statistically significant with p > .05
REFERENCES
Arnspargar, A. (November 2007). Act on fact: Using data to improve
student success. Presentation to the Florida Community Colleges,
Orlando, FL.
Astin, A.W. (1984). Student involvement: A developmental theory for
higher education. Journal of College Student Personnel, 24, 297-308.
Astin, A.W. (1993). What matters in college: Four critical years
revisited. San Francisco: Jossey-Bass Publishers.
Astin, A.W. (1999). Involvement in learning revisited: Lessons we
have learned. Journal of College Student Development, 40, 587-598.
Beck, E. (2003). It's an honor. Community College Week, 15,
4-5.
Boulard, G. (2003, January 6). The honorable thing to do? Community
College Week, 15, 6-9.
Bulakowski, C. and Townsend, B.K. (1995). Evaluation of a community
college honors program: Problems and possibilities. Community College
Journal of Research and Practice, 19, 485-499. Abstract obtained from
The H.W. Wilson Company/Wilson Web, 1995, Abstract No. 199530503789001.
Bushong, S. (2009, January 23). Community-college enrollments are
up, but institutions struggle to pay for them. The Chronicle of Higher
Education, 55, A24.
Chickering, A.W. and Gamson, Z.F. (1987). Seven principles for good
practice in undergraduate education. American Association of Higher
Education Bulletin, 39, 3-7.
Chickering, A.W. and Gamson, Z.F. (1999). Development and
adaptations of the seven principles for good practice in undergraduate
education. New Directions for Teaching & Learning, 80, 75-81.
Community College Survey of Student Engagement (2004b).
Institutional report 2004. Austin, TX: University of Texas at Austin.
Community College Survey of Student Engagement (2006d). Why focus
on student engagement? Retrieved 4/9/06 from <http://www.ccsse.org/
aboutccsse/engage.cfm>.
Evelyn, J. (2002, October 4). An elite vision. The Chronicle of
Higher Education, 49, A31-32.
Hu, S. and Kuh, G.D. (2002). Being (dis)engaged in educationally
purposeful activities: The influences of student and institutional
characteristics. Research in Higher Education, 43, 555-575.
Kane, H. (2001). Honors programs: A case study of transfer
preparation. New Directions for Community Colleges, 114, 25-38.
Kuh, G.D. (2003). The National Survey of Student Engagement:
Conceptual framework and overview of psychometric properties. Indiana
University Center for Postsecondary Research and Planning. Retrieved
2/18/06 from <http://www.nsse.iub.edu/pdf/conceptual_framework.2003.pdf>.
Lanier, G.W. (2008). Towards reliable honors assessment. Journal of
the National Collegiate Honors Council, 9, 81-149.
Long, E.C.J. and Lange, S. (2002). An exploratory study: A
comparison of Honors & non-honors students. The National Honors
Report, 23, 20-30.
Marklein, M. B. (2003, June 10). Two-year schools aim high. USA
Today, p. 10.
Marti, C.N. (2004). Overview of the CCSSE instrument and
psychometric properties. Community College Survey of Student Engagement
website. Retrieved 1/29/05 from
<http://www.ccsse.org/aboutsurvey/psychometrics.pdf>.
Marti, C.N. (2009). Dimensions of student engagement in American
community colleges: Using the Community College Student Report in
research and practice. Community College Journal of Research and
Practice, 33, 1-24.
McClenney, K.M. (2007). The Community College Survey of Student
Engagement. Community College Review, 35, 137-146.
Otero, R. and Spurrier, R. (2005). Assessing and evaluating honors
programs and honors colleges: A practical handbook. Lincoln, Nebraska:
University of Nebraska, National Collegiate Honors Council.
Ouimet, J.A. (2001, Nov-Dec). Assessment measures: The community
college survey of student engagement. Assessment Update, 13, 8-9.
Outcalt, C. (1999). The importance of community college honors
programs. New Directions for Community Colleges, 108, 59-68.
Pace, C.R. (1984). Measuring the quality of college student
experiences. Los Angeles: University of California, Higher Education
Research Institute.
Pascarella, E.T. and Terenzini, P.T. (1991). How college affects
students: Findings and insights from twenty years of research. San
Francisco: Jossey-Bass Publishers.
Sandler, M.E. (2000). Career decision-making, self-efficacy,
perceived stress, and an integrated model of student persistence: A
structural model of finances, attitudes, behavior and career
development. Research in Higher Education, 41, 537-582.
Selingo, J. (2002, May 31). Mission creep? The Chronicle of Higher
Education, 48, A19-21.
Whipple, W. (2003). Using assessment properly. The National Honors
Report, 23, 41.
LAURA O. ROSS AND MARCIA A. ROMAN
SEMINOLE COMMUNITY COLLEGE
The authors may be contacted at RossL@scc-fl.edu.
Table 1: Mean Scores of Academic, Intellectual and Social
Experiences (CCSSE, 2007) for Honors Students and Non-Honors
Students
CCSR        Course     During the current semester,        SCC Mean   SCC Honors
2005-2007   Feedback   how often did you do the            2007       Mean (Course
Question #  Form       following?                          (CCSR)     Feedback Form)
4a   1a   Asked questions in class                                 2.89   3.05
4b   1b   Made a class presentation                                2.25   2.31
4c   1c   Prepared two or more drafts of an assignment             2.53   2.64
4d   1d   Worked on papers that required integrating ideas         2.90   3.29
          or information from various sources
4f   1e   Worked with other students on projects during class      2.34   2.75
4g   1f   Worked with classmates outside of class to complete      1.82   2.41
          an assignment
4i   1g   Participated in a community-based project as part        1.26   1.68
          of coursework
4j   1h   Used the Internet to complete an assignment              2.89   3.45
4k   1i   Used e-mail to communicate with your instructor          2.52   2.40
4l   1j   Discussed grades or assignments with your instructor     2.54   2.42
4m   1k   Talked about career plans with your instructor           2.05   1.76
4n   1l   Discussed ideas from your readings or class with         1.77   2.18
          your instructor outside of class
4p   1m   Worked harder than you thought you could to meet         2.43   2.70
          your instructor's standards or expectations
4r   1n   Discussed ideas from the readings or class with          2.54   2.85
          others outside of class (students, family members,
          co-workers)
4u   1o   Skipped class                                            1.69   1.15
4o   1p   Received prompt feedback from your instructor about      2.63   2.95
          your performance
Scale: 1 = Never; 2 = Sometimes; 3 = Often; 4 = Very often
Table 2: Mean Scores of Character of Mental Activities
(CCSSE, 2007) for Honors and Non-Honors
CCSR        Course     During the semester, how much       SCC Mean   SCC Honors
2005-2007   Feedback   have your courses emphasized        2007       Mean (Course
Question #  Form       the following?                      (CCSR)     Feedback Form)
5a   2a   Memorizing facts, ideas, or methods from your            2.76   2.36
          courses and reading so that you can repeat them
          in pretty much the same form
5b   2b   Analyzing the basic elements of an idea,                 2.84   3.22
          experience, or theory
5c   2c   Synthesizing and organizing ideas, information,          2.72   3.21
          or experiences in new ways
5d   2d   Making judgments about the value or soundness of         2.62   3.21
          information, arguments, or methods
5e   2e   Applying theories or concepts to practical               2.65   2.90
          problems or in new situations
5f   2f   Using information you have read or heard to              2.65   2.69
          perform a new skill
Scale: 1 = Very little; 2 = Some; 3 = Quite a bit; 4 = Very much
Table 3: Mean Scores of Items Pertaining to Knowledge,
Skills and Personal Development (CCSSE, 2007) for
Honors and Non-Honors Students
CCSR        Course     During the current semester, to     SCC Mean   SCC Honors
2005-2007   Feedback   what extent did this course help    2007       Mean (Course
Question #  Form       you develop in the following        (CCSR)     Feedback Form)
                       areas?
12c   3a   Writing clearly and effectively                          2.66   2.70
12d   3b   Speaking clearly and effectively                         2.60   2.74
12e   3c   Thinking critically and analytically                     2.86   3.23
12f   3d   Solving numerical problems                               2.62   2.17
12g   3e   Using computing and information technology               2.58   2.62
12h   3f   Working effectively with others                          2.55   2.87
12i   3g   Learning effectively on my own                           2.83   2.89
12j   3h   Understanding myself                                     2.53   2.68
12k   3i   Understanding people of other racial and ethnic          2.29   2.84
           backgrounds
12l   3j   Developing a personal code of values and ethics          2.27   2.59
12m   3k   Contributing to the welfare of the community             1.88   2.52
12n   3l   Developing clearer career goals                          2.49   2.39
Scale: 1 = Very little; 2 = Some; 3 = Quite a bit; 4 = Very much
Table 4: Mean Scores Indicating Less Engagement
for SCC Honors Students
CCSR        Course     Item                                SCC Mean   SCC Honors
2005-2007   Feedback                                       2007       Mean (Course
Question #  Form                                           (CCSR)     Feedback Form)
4k    1i   Used e-mail to communicate with your instructor          2.52   2.40
4l    1j   Discussed grades or assignments with your                2.54   2.42
           instructor
4m    1k   Talked about career plans with your instructor           2.05   1.76
12f   3d   Solving numerical problems                               2.62   2.17
12n   3l   Developing clearer career goals                          2.49   2.39
Scale (items 4k-4m): 1 = Never; 2 = Sometimes; 3 = Often; 4 = Very often
Scale (items 12f, 12n): 1 = Very little; 2 = Some; 3 = Quite a bit;
4 = Very much