The impact of computer literacy on student academic performance in the introductory management information systems course.
Ye, L. Richard
COMPUTER LITERACY AND PERFORMANCE
"Computer literacy" is a commonly used term in the
business world, but it is not precisely defined. Computer literacy, in
general, is being knowledgeable about the computer and its applications
(Rochester & Rochester, 1991). Such knowledge appears to have two
dimensions: conceptual, and operational (Winter, Chudoba, &
Gutek,1997). The conceptual dimension includes an understanding of the
inner workings of a computer or general computer terminology. Without
such knowledge a user would find it difficult to figure out any system
problems, or to learn to adapt quickly to new systems or software. The
operational dimension refers to the necessary skills a user acquires,
through training and practice, in order to operate specific systems to
complete specific tasks.
While prior research has not empirically evaluated the performance
impact of computer literacy, there is evidence that such an impact is
likely to be task-dependent (Goodhue & Thompson, 1995;
Lonstreet & Sorant, 1985; Rhodes, 1985; Thompson, Higgins, &
Howell, 1994). For example, if we considered a student to be highly
computer literate because s/he demonstrated a high level of proficiency
in using a word processor or a spreadsheet program, we would also expect
the student to perform well on tasks involving the use of those programs.
We could not predict, however, how the student would perform on tasks
involving the use of a database program if s/he had not received
training in database software. This
leads us to the following hypothesis:
H1: Students' task performance will be positively correlated
with their level of computer literacy, if the same type of software is
involved in assessing their level of computer literacy and their task
performance.
Winter, Chudoba, and Gutek (1997) use the notion of
"functional computer literacy" to argue that a user needs both
the conceptual and operational knowledge to perform effectively and
productively in various white-collar work settings. A truly
"computer fluent" user, they contend, does not simply memorize the correct sequence of keystrokes or mouse clicks. Rather, the user
must form an internal representation of the system's structure and
functions. Indeed, there is consistent research evidence that links a
user's valid mental models of a system to better task performance
(Foss & DeRidder, 1988; Booth, 1989; Sein & Bostrom, 1990;
Weller, Repman, Lan, & Rooze, 1995). Within the context of computer
literacy training, we would therefore expect students to form useful
mental models of a computer system based on their conceptual knowledge
of the system, and to be able to transfer that knowledge to tasks in an
unfamiliar hardware/software environment. This leads us to a second
hypothesis:
H2: Students' task performance will be positively correlated
with their level of computer literacy, even if the task involves the use
of unfamiliar software.
The foregoing arguments suggest that the performance impact of
students' computer literacy depends on the nature of the task. More
specifically, it depends on whether the task involves the transfer and
application of the conceptual and operational knowledge obtained from
their computer literacy training and practice. This gives rise to a
third hypothesis:
H3: Students' task performance will have no correlation with
their level of computer literacy, if the task does not require the use
of their conceptual or operational knowledge of the computer
hardware/software.
METHOD
A multiple regression analysis was applied to assess the
significance of students' level of computer literacy in predicting
their task performance. In addition to the primary independent variable,
level of computer literacy, the analysis included two other
independent variables: gender and grade point average.
A number of prior studies have investigated student gender as a
predictor of academic performance, but the results appear
inconclusive. Two earlier studies found that female students performed
better than males in accounting (Mutchler, Turner, & Williams, 1987;
Lipe, 1989), while others found males outperforming females in finance
(Borde, Byrd, & Modani, 1996) and economics (Heath, 1989), but no
gender effect in marketing (Borde, 1998).
Extensive research also exists on gender-related computer attitudes
and aptitude, with more consistent results. Several studies found that,
compared to men, women tend to display lower computer aptitude (Rozell
& Gardner, 1999; Smith & Necessary, 1996; Williams, Ogletree,
Woodburn, & Raffeld, 1993) and higher levels of computer anxiety
(Anderson, 1996; Bozionelos, 1996; Igbaria & Chakrabarti, 1990). Because
the present study focuses on students' performance in an
information systems course, we include gender in the research model so
its effect can also be explored.
The use of grade point average (GPA) as a predictor of academic
performance is widely reported. Numerous studies have found GPA to be
significantly correlated with student performance in accounting (Doran,
Bouillon, & Smith, 1991; Eskew & Faley, 1988; Jenkins, 1998), marketing
(Borde, 1998), and economics (Bellico, 1974; Cohn, 1972). However,
because the predictive impact of GPA in an information systems course is
unknown, we believe it should also be included in the present research
model.
This study was conducted among 92 business school students enrolled
in four sections of an introductory information systems (IS) course at a
public university. All sections were taught by the same instructor,
under the same set of conditions. The use of information technology was
an integral part of the course requirement. To pass the course,
students had to complete two hands-on course projects, one
hands-on mid-term examination, and a traditional paper-and-pencil final
examination.
At the beginning of the course, we measured the students'
level of business computer literacy with an existing examination
instrument. This provided an individual numeric computer literacy score
(CLS), the data for the primary independent variable of the study. The
instrument had been administered twice a year to determine whether a
student had met the business school's computer literacy requirement.
The exam consisted of three parts: hardware and software concepts, word
processing, and spreadsheet modeling. The concepts part was administered
on paper in multiple-choice format, while the remaining two parts
involved hands-on word-processing and spreadsheet problems to be
completed on a computer. Over the five years since its first use, the
exam had produced a consistent passing rate of 30 to 35 percent,
suggesting that the instrument is fairly reliable.
A student information database was used to collect data for the
other independent variables: gender and student GPA prior to taking the
IS course. A multiple regression model was run on four performance
measures: project 1, project 2, mid-term exam, and final exam. Project 1
involved the development of a database application using a database
management system (DBMS), to which the students had no prior exposure.
This performance measure was designed to test Hypothesis 2.
Project 2 involved the development of a production plan using
spreadsheet modeling. The measure was used to test Hypothesis 1.
Students completed the two projects individually, outside the classroom.
The hands-on mid-term exam consisted of three parts: IS concept
questions answered with a word processor, a database problem, and a
spreadsheet problem. This measure was designed to test Hypotheses 1 and
2. The final exam consisted entirely of conceptual IS questions and was
administered in paper-and-pencil format. The exam questions focused on
traditional IS theories such as transaction processing, decision
support, and systems development, which can be considered
hardware/software-independent. The measure was used to test Hypothesis
3.
The general regression model was formulated as follows:

P_i = β_0 + β_1 CLS + β_2 GEN + β_3 GPA + ε_i

where:

P_i = numeric score on each of the two projects and the two exams (100 possible on each),
CLS = numeric score on the computer literacy exam (100 possible),
GEN = subject's gender, coded 1 for male and 2 for female,
GPA = subject's cumulative grade point average prior to taking the course (4.0 scale),
β_0 = intercept,
β_1 through β_3 = slope coefficients, and
ε_i = error term.
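To make the model concrete, the following is a rough sketch of how such a regression could be estimated in Python with pandas and statsmodels. It is not the author's actual analysis: the DataFrame, its column names (score, cls, gen, gpa), and the example values are hypothetical stand-ins for the study's data, which are not available.

```python
# A minimal sketch, assuming hypothetical data: estimating
# P_i = b0 + b1*CLS + b2*GEN + b3*GPA + e_i by ordinary least squares.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical example rows; column names are illustrative only.
df = pd.DataFrame({
    "score": [72, 85, 60, 91, 55, 78, 66, 83],          # task score, 100 possible
    "cls":   [64, 88, 40, 95, 33, 70, 52, 81],          # computer literacy score
    "gen":   [1, 2, 1, 2, 1, 2, 2, 1],                  # 1 = male, 2 = female
    "gpa":   [3.1, 2.4, 2.8, 3.8, 2.6, 2.2, 3.0, 3.3],  # 4.0 scale
})

# Fit the model specified above and print the regression report.
model = smf.ols("score ~ cls + gen + gpa", data=df).fit()
print(model.summary())
```

The summary output includes the slope coefficients, their t statistics, the adjusted R², and the model F value, which are the quantities reported in Table 3.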
RESULTS
Descriptive statistics for the various measures of independent and
dependent variables are presented in Table 1. The relatively large
standard deviation value for CLS suggests that there was a great degree
of variation among students' computer literacy levels.
Shown in Table 2 are simple pair-wise correlation coefficients
among the independent variables. We found that gender and CLS were
negatively correlated at the .05 probability level. This is not
surprising. As discussed earlier, prior studies suggest that males tend
to demonstrate a higher level of computer aptitude than females. We also
found GPA and CLS to be positively correlated; this is consistent with
the expectation that high-achieving students make greater efforts to
acquire the necessary knowledge and skills, including computer
literacy.
The correlations found in Table 2 do not pose a serious
multicollinearity problem. The correlation coefficients were relatively
small. We also calculated the variance inflation factors (VIF) for each
of the variables. While a VIF considerably larger than 1 would be
indicative of serious multicollinearity problems, none of the VIF values
calculated for this study was greater than 1.10.
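For readers who wish to reproduce these diagnostics, here is a minimal sketch in Python. The predictor columns are the same hypothetical stand-ins used in the regression example above, not the study's actual data.

```python
# A minimal sketch of the collinearity checks described above,
# using hypothetical predictor values.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

predictors = pd.DataFrame({
    "cls": [64, 88, 40, 95, 33, 70, 52, 81],
    "gen": [1, 2, 1, 2, 1, 2, 2, 1],
    "gpa": [3.1, 2.4, 2.8, 3.8, 2.6, 2.2, 3.0, 3.3],
})

# Pairwise correlations among the independent variables (cf. Table 2).
print(predictors.corr().round(2))

# Variance inflation factor for each predictor; adding a constant
# column makes each VIF reflect a model that includes an intercept.
X = sm.add_constant(predictors)
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, round(variance_inflation_factor(X.values, i), 2))
```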
In Table 3, we report the results of the regression analysis. The
proposed model fit well in predicting performance for three
of the four performance measures: Project 2, the Mid-term Exam, and the
Final Exam. The adjusted coefficients of determination (R²) were 0.21,
0.36, and 0.27, and the F values were 9.33, 18.30, and 12.18,
respectively, all significant at the 0.01 probability level.
Students' level of computer literacy, represented by CLS,
proved to be significant in predicting student performance on Project 2
(t = 2.79, p < 0.01) and the Mid-term Exam (t = 2.38, p < 0.05).
These results lend support to the hypothesis (H1) that students'
level of computer literacy may influence their task performance, if the
task involves the use of software familiar to them. We did not find CLS
to be predictive of student performance on Project 1. Therefore, there
was no evidence to support the hypothesis (H2) that students might be
able to transfer their computer literacy knowledge to tasks involving
the use of an unfamiliar type of software. This also seems to explain
the difference between the t statistic for Project 2 and the t statistic
for the Mid-term Exam, since one part of the exam involved the use of
database software. Finally, we found no CLS impact on student
performance on the Final Exam. This result provides support for H3, that
students' level of computer literacy will have no effect on their
task performance, if the task requires neither conceptual nor
operational knowledge of a computer system.
Of the other two independent variables, we found GPA to be a
significant predictor of performance across all four dependent measures.
This is consistent with prior research asserting the validity of using
students' past GPA as a strong predictor of future academic
performance. For high academic achievers, an IS course, despite its heavy
technological content, does not appear to be more difficult than courses
in other subject areas.
Overall, gender was not a statistically significant performance
predictor for an IS course. Gender's significant effect on Project
2 performance is worth further exploration, however. Because Project 2
involved the use of a spreadsheet program, the result seems to raise an
interesting question: are females better at using a spreadsheet program
than males? More research is needed to confirm and explain this
observation.
DISCUSSION AND CONCLUSION
In this study we examined the performance impact of students'
computer literacy in an information systems course. Results of the
multiple regression analysis showed the predictive power of computer
literacy to be task-dependent. As expected, when a task required
substantial use of computers and a specific type of software, students
with a higher level of computer literacy, as measured by their
proficiency in using that specific combination of hardware/software,
achieved significantly better performance. Conversely, when the task
required the use of unfamiliar software, or required neither the use of
computers nor conceptual knowledge of computer hardware/software,
students' level of computer literacy had no significant impact on their
performance. Instead, students' GPA had far more predictive power
for such tasks.
The study's failure to find support for H2 requires further
explanation. Project 1, designed to test H2, required students to
develop simple database applications. While the database software was
unfamiliar to the students, we had expected them to use their mental
models of a computer system to transfer existing knowledge to a novel
task. What affected their performance, however, did not appear to
be their operational knowledge of the software, but rather their
understanding of the database concepts. A closer look at their submitted
work revealed that many students had difficulty grasping fundamental
concepts such as relationships and Boolean logic in complex database
queries. While the graphical user interfaces in today's software
environment provide great operational consistency across different
applications, what seems to dictate task performance, as implied by this
study, is an understanding of the task itself. Computers are only tools.
A poor understanding of the task will lead to an ineffective use of
tools.
The research reported here is limited by several factors. The
participants were all from courses taught by one instructor at one
university. In the absence of a standard test instrument, the computer
literacy examination used in the study was a choice of convenience.
While extraneous factors such as instructor styles and task designs were
controlled, the results of the study may not be generalizable to
different institutions. Further research is needed to overcome these
limitations. First, a standard evaluation instrument for computer
literacy must be developed and validated. Second, the current study
needs to be replicated under different settings.
The findings from this study have important implications for
teaching and learning. Despite near-universal computer literacy
requirements among academic institutions, the performance impact of such
requirements is far from clear-cut. The relationship between a
technology-induced increase in students' personal productivity and
their academic performance appears, at best, indirect. The predictive
power of computer literacy on performance does not depend on whether and
how much information technology is used in the completion of a course.
Rather, it depends on how much that technology is integrated into the
evaluation of student performance. Unless we can demonstrate a clear
relationship between computer literacy and performance, many students
will continue to perceive the requirement as an unnecessary roadblock to
their progress in their chosen academic programs.
Future research also needs to explore the linkage between computer
literacy and personal productivity, and the linkage between productivity
and academic performance. Establishing the predictive validity of the
computer literacy requirement is paramount. As information technology is
further integrated into the educational process, we need evidence of
such predictive validity to help influence students' attitudes and
behavior toward fulfilling the requirement. Meanwhile, this would also
heighten the need for adequate student training in the uses of
information technology, if we expect students to be successful.
REFERENCES
Anderson, A.A. (1996). Predictors of computer anxiety and
performance in information systems. Computers in Human Behavior, (12)1,
61-77.
Bellico, R. (1974). Student attitudes and undergraduate achievement
for economics majors. Journal of Economic Education, (5)2, 67-68.
Booth, P. (1989). An Introduction to Human-Computer Interaction.
Hillsdale, NJ: Lawrence Erlbaum Associates.
Borde, S.F. (1998). Predictors of student academic performance in
the introductory marketing course. Journal of Education for Business,
73(5), 302-306.
Borde, S.F., Byrd, A.K., & Modani, N.K. (1996). Determinants of
student performance in introductory corporate finance courses. Presented
at the Southern Finance Association (SFA) Annual Meeting, Key West,
Florida.
Bozionelos, N. (1996). Psychology of computer use: XXXIX.
Prevalence of computer anxiety in British managers and professionals.
Psychological Reports, (78), 995-1002.
Cohn, E. (1972). Students' characteristics and performance in
economic statistics. Journal of Economic Education, (3)1, 106-111.
Doran, B. M., Bouillon, M. L., & Smith, C. G. (1991).
Determinants of student performance in Accounting Principles I and II.
Issues in Accounting Education, (6)1, 74-84.
Eskew, R. K., & Faley, R. H. (1988). Some determinants of
student performance in the first college-level financial accounting
course. The Accounting Review, (63)1, 137-147.
Foss, D.J., & DeRidder, M. (1988). Technology transfer: On
learning a new computer-based system. In J.M. Carroll (Ed.),
Interfacing Thought (pp. 159-183). Cambridge, MA: The MIT Press.
Goodhue, D.L., & Thompson, R.L. (1995). Task-technology fit and
individual performance. MIS Quarterly, (19)2, 213-236.
Heath, J.A. (1989). Factors affecting student learning - An
econometric model of the role of gender in economic education. Journal
of Economic Education, (20)2, 226-230.
Igbaria, M., & Chakrabarti, A. (1990). Computer anxiety and
attitudes toward microcomputer use. Behavior and Information Technology,
(9)3, 220-241.
Jaderstrom, S. (1995). Technology's impact on computer and
business curricula. In N.J. Groneman & K.C. Kaser (Eds.), Technology
in the classroom (pp. 1-9). National Business Education Yearbook, No.
33. Reston, VA: National Business Education Association.
Jenkins, E.K. (1998). The significant role of critical thinking in
predicting auditing students' performance. Journal of Education for
Business, (73)5, 274-279.
Jones, M. C., & Pearson, R.A. (1996). Developing an instrument
to measure computer literacy. Journal of Research on Computing in
Education, (29)1, 17-28.
Lipe, M.G. (1989). Further evidence on the performance of female
versus male accounting students. Issues in Accounting Education, (4)1,
144-152.
Lonstreet, W.S., & Sorant, P.E. (1985). Computer literacy -
Definition? Educational Horizons, (63)3, 117-120.
Mutchler, J.F., Turner, J.H., & Williams, D.D. (1987). The
performance of female versus male accounting students. Issues in
Accounting Education, (2)1, 103-111.
Rhodes, L.A. (1985). Moving beyond computer literacy. Educational
Leadership, (42)5, 88-89.
Rochester, J., & Rochester, J. (1991). Computers for People:
Concepts and Applications. Homewood, IL: Irwin.
Rozell, E.J., & Gardner, W.L. (1999). Computer-related success
and failure: A longitudinal field study of the factors influencing
computer-related performance. Computers in Human Behavior, (15)1, 1-10.
Sein, M.K., & Bostrom, R.P. (1990). An experimental
investigation of the role and nature of mental models in the learning of
desktop systems. In K.M. Kaiser & H.J. Oppelland (Eds.), Desktop
Information Technology. North Holland: Elsevier Science Publishers.
Smith, B.N., & Necessary, J.R. (1996). Assessing the computer
literacy of undergraduate college students. Journal of Education,
(117)2, 188-193.
Tanyel, F., Mitchell, M. A., & McAlum, H. (1999). The Skill set
for success of new business school graduates: Do prospective employers
and university faculty agree? Journal of Education for Business, (75)1,
33-37.
Thompson, R.L., Higgins, C.A., & Howell, J.M. (1994). Influence
of experience on personal computer utilization: Testing a conceptual
model. Journal of Management Information Systems, (11)1, 167-187.
Trauth, E.M., Farwell, D.W., & Lee, D. (1993). The IS
expectation gap: Industry expectations versus academic preparation. MIS
Quarterly, 17(3), 293-307.
Weller, H., Repman, J., Lan, W., & Rooze, G. (1995). Improving
the effectiveness of learning through hypermedia-based instruction: The
importance of learner characteristics. Computers in Human Behavior,
11(3-4), 451-465.
Williams, S.W., Ogletree, S.M., Woodburn, W., & Raffeld, P.
(1993). Gender roles, computer attitudes, and dyadic computer
interaction performance in college students. Sex Roles, (29)7-8,
515-525.
Winter, S.J., Chudoba, K.M., & Gutek, B.A. (1997). Misplaced resources? Factors associated with computer literacy among end-users.
Information & Management, (32), 29-42.
Zhao, J. J., Ray, C. M., Dye, L. J., & David, R. (1998).
Recommended computer end-user skills for business students by Inc. 500
executives and office systems educators. OSRA Systems Research Journal,
16(1), 1-8.
L. Richard Ye, California State University, Northridge
TABLE 1. Descriptive Statistics
Variable M SD N
Dependent Variables
Project 1 score 67.40 23.68 92
Project 2 score 80.67 15.11 92
Mid-term Exam score 56.43 11.25 92
Final Exam score 55.49 11.31 92
Independent Variables
CLS 62.15 21.16 92
Gender 1.51 0.50 92 *
GPA 2.67 0.48 92
* 45 males and 47 females
TABLE 2. Simple Correlation Coefficients
Variable CLS Gender GPA
CLS 1.00 -.24 * .24 *
Gender 1.00 .10
GPA 1.00
* p < 0.05.
TABLE 3. Results of Regression Models for the Four Performance Scores:
Standardized Beta Coefficients (t Statistics) for Independent Variables

Variable     Project 1    Project 2    Mid-term Exam   Final Exam
CLS          -0.12        0.28         0.24            0.12
             (-1.05)      (2.79) **    (2.38) *        (-1.23)
Gender       0.05         0.27         -0.09           -0.04
             (0.42)       (2.66) **    (-1.04)         (-0.47)
GPA          0.32         0.20         0.53            0.50
             (2.96) **    (2.07) *     (6.03) **       (5.36) **
Intercept    30.92        34.83        19.50           21.52
             (1.88)       (3.71) **    (3.13) **       (3.20) **
Model statistics
Adj. R²      0.07         0.21         0.36            0.27
F value      3.33 *       9.33 **      18.30 **        12.18 **
* p < 0.05.
** p < 0.01.