Fact or fiction? A study of managerial perceptions applied to an analysis of organizational security risk.
Taylor, Richard G. ; Brice, Jeff, Jr.
INTRODUCTION
Perception may be defined as the process by which people translate
external cues into a rational and integrated idea of the world around
them (Lindsay & Norman, 1977). Even though these impressions are
often based on deficient and/or unreliable information, perception is
commonly accepted by initiating actors as reality and serves to direct
human actions in general (Daniels, 2003). Business decisions are based,
largely, on managerial perceptions. Indeed, managers have been
theorized to draw conclusions based on their inaccurate perceptions
rather than on a critical review of all available environmental
information (Starbuck & Mezias, 1996).
Although the situation described above seems ripe for further
investigation, it is apparent that few management scholars have studied
managerial perceptions over the past several decades (Mezias &
Starbuck, 2003). This apparent chasm in management knowledge has been
attributed to a variety of reasons, including the difficulty of
designing studies and instruments that accurately measure managerial
perceptions (Starbuck & Mezias, 1996), the lack of interest among
businesses in supporting this type of investigation (Mezias & Starbuck,
2003), and a dispute over the value of this type of research among
scholars (Das, 2003). Notably, one biting criticism of the study of
managerial perceptions is that academicians themselves may harbor
disconfirming preconceived notions (biases) about study subjects
(managers) which cause them to interpret study results through a lens
of criticism (Das, 2003). The conundrum has been aptly described as
management researchers not being familiar enough with the actual
practice of management to understand the subtle nuances of the craft.
Accordingly, scholars have been directed to "engage in research
only after they had acquired some semblance of the managerial
world" (Das, 2003). Workable familiarity notwithstanding, the end
result has been described as the construction of studies that are deemed
to hold little utility for scholarly examination and even less for
managerial practice (Das, 2003). In response, Mezias and Starbuck (2003)
famously answered such criticisms and challenged the community of
management scholars to "join the odyssey" in the pursuit of
more definitive studies of managerial perceptions to address significant
research and practice inquiries.
From this starting point, the research that we present here is
designed to address the "lack of relevance" issue in the study
of management perceptions. As with most studies of managerial
perceptions, the challenge is to ascertain managers' perceptions
about particular issues of consequence to practice and research, and
then to contrast these cognitive beliefs with objective evidence of the
situations to demonstrate either the congruence of held perceptions with
reality or to highlight significant differences between what is believed
and what is actually occurring. Since managers make decisions based on
their perceptions of environmental stimuli, this type of investigation
is useful to understand the rationale behind how managers formulate
strategy, develop contingencies, and otherwise react to ever-changing
market conditions.
One such important topic that managers should constantly reconsider
is organization security. The value of an organization includes
intellectual property, specialized processes, proprietary knowledge, and
other information that must be protected from external competition and
industrial thievery. To prevent breaches of security, management needs
to understand and identify the vulnerabilities that exist within the
organization, and then implement decisions that correct or eliminate
those vulnerabilities (Rosenthal, 2003). Knowing the cause of security
risks is vital to the development of an overall business security
strategy (Whitman, 2003). In order for managers to do this well, there
needs to be a clear assessment and understanding of technology-based
threats, employee (behavioral) threats, and an accurate judgment of the
firm's state of overall security. Therefore, in order for
organizational value to be preserved (and possibly enhanced), managerial
perceptions of organizational security risks need to be based in reality
and not fiction.
Accordingly, we apply a qualitative research design to explore a
genuine situation in the management of an existing financial institution
to discover whether the perceptions of organizational risk and
information security held by management reflect the reality of
organizational operations or if they represent simplified cognitive
biases toward a desired end state. Not only does this type of
investigation hold obvious import for the actors involved but it clearly
brings researchers closer to the milieu of managerial thinking and
practices. Specifically, in the following section we describe the effect
that managers' perceptions may have on organizational behavior and
performance. Subsequently, we develop hypotheses and describe our
empirical study. Last, we submit our findings and discuss their
relevance for management practice and research.
LITERATURE REVIEW
Managerial Perception and Organizational Performance
Managers receive information cues from the environment and from
within their own firm. However, this information is filtered by managers
through the lens of their own perceptions (Mezias & Starbuck, 2003).
Since every individual possesses unique cognitive processes and biases,
no two individuals interpret the information cues in the very same
manner. Therefore, managerial perceptions tend to be unique to each
individual interpreting the cues (Fahey & Narayanan, 1989). This
means that no two managers will draw exactly the same conclusions when
presented with the same informational stimuli. For instance, previous
research has demonstrated that various managers have difficulty
concurring on the assignment of their firm's undertakings to
operational categories (such as SIC Codes) because each manager's
cognitive biases shape the categorization differently (Boland et al.,
2001). In other
words, organizational activity may be initiated not based on real events
or objective information; but, on the filtered perceptions of the
particular manager highest in authority in relation to the situation to
be rectified.
The emergence of cognitive biases in managers' development of
perception is unavoidably associated with the concept of bounded
rationality (Simon, 1957). The theory of bounded rationality states that
decision makers construct simplified models to deal with the world.
Simon argues that the decision maker "...behaves rationally with
respect to this [simplified] model, and such behavior is not even
approximately optimal with respect to the real world. To predict his
behavior, we must understand the way in which this simplified model is
constructed, and its construction will certainly be related to his
psychological properties as a perceiving, thinking and learning
animal" (p. 198). Hence, scholars must consider the constitution of
the cognitive schema of managers if we are to ascertain the rationale
behind organizational strategy and behavior.
While the limitations of a manager's schema and the inherent
cognitive bias toward the information residing within it suggest that
some organizational behavior may be based on erroneous managerial
perceptions, few studies have been performed to investigate this
possibility (Mezias & Starbuck, 2003). This is likely due to the
difficulty of comparing managers' perceptions with those of
external observers (Santos & Garcia, 2006). However, some
researchers have studied the implied effects of cognitive bias on
managerial decisions and, ultimately, organizational performance.
Managers who make decisions in the fog of misleading perceptions have
been theorized to initiate flawed organizational strategy (Bourgeois,
1985; Boyd et al., 1993), misinterpret strategic representations (Barr
& Huff, 1997; Boland et al., 2001), and misperceive environmental
risks and industry threats (Barr et al., 1992; Starbuck, 1992). In short,
managers who make decisions by relying on simplified mental models are
likely to damage organizational performance. Therefore, research that
measures the accuracy of managerial perceptions, and evaluates
repercussions resulting from organizational activity initiated because
of those perceptions, is useful to help elucidate fluctuations in
organizational success.
Managerial Perception and Organizational Security Risks
Cyert & March's (1963) behavioral theory of the firm shows
that management decisions are limited by bounded rationality. Bounded
rationality proposes the view that managers endeavor to be rational but
must cope with acute restrictions on their capacity to process
information (Simon, 1956; Bromiley & Euske, 1986). Cyert and March
further indicate that managers make decisions without the availability
of all necessary information. Starr (1969) and Slovic (1987) have
utilized the
theory of bounded rationality to research the effects of perceptions of
risk. They have concluded that management is blind to the high risks
posed by technology because of its perceived benefits. Their
results indicate that rational decision making is not always used
"when judging probabilities, making predictions, or attempting to
cope with probabilistic tasks" (Slovic, 1987, p. 281). Instead,
people tend to use judgmental heuristics or simplification strategies.
The heuristics may sometimes be valid in certain circumstances, but in
others they can lead to biases that are "large, persistent and
serious in their implications for decision making" (Slovic et al.,
1976, p. 36).
The psychological dimensions of risk can be distilled into two
primary factors: (1) the severity of the consequence, and (2) the
probability of its occurrence (Slovic et al., 1976). Initial research on
perceived risks focused on hazards such as earthquakes, nuclear
power, and food preservatives. However, other research has taken these
theories and methods and applied them to more specific subjects such as
the perception of risks toward using seat-belts (Slovic et al., 1978),
adolescents' perceptions of risks from smoking (Slovic, 1998), and
the risks of using a mobile phone while driving (White et al., 2004).
This paper applies the theory of risk perception to investigate
managers' perceptions of organizational security threats.
The two factors mentioned above can be applied to management's
perception of employee behavior that may expose the organization to
security risks. In consideration of the first factor, it is suggested
that managers are primarily concerned with the severity of the
consequences from technology-based threats and will not be concerned
with, or may not contemplate, unintentional threats posed by employee
behavior.
Regarding the second factor, management will consider organizational
vulnerability based on employee actions to be rare occurrences,
therefore posing little risk and needing no attention.
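The two factors above are often operationalized as a simple multiplicative risk score. The following is a minimal sketch of that idea; the scoring function, the ordinal scales, and the threat entries are our illustrative assumptions, not data from this study:

```python
# Sketch of the two-factor view of perceived risk:
# severity of consequence x probability of occurrence.
# Scales and threat ratings below are hypothetical.

def risk_score(severity: int, probability: int) -> int:
    """Expected-loss style score: higher values warrant more attention."""
    return severity * probability

# Hypothetical threats rated on 1-5 scales as (severity, probability).
threats = {
    "external network intrusion": (5, 2),  # severe but judged unlikely
    "employee shares a password": (4, 3),  # behavior managers may discount
    "sensitive paper in trash":   (3, 4),  # routine, low-salience action
}

# Rank threats by score; under this model the "rare" employee behaviors
# can outrank the more salient technological threat.
ranked = sorted(threats, key=lambda t: risk_score(*threats[t]), reverse=True)
print(ranked)
```

Even this toy calculation illustrates the paper's point: if managers systematically underestimate the probability term for employee behaviors, those threats drop out of the ranking regardless of their true severity.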
A long stream of research indicates that management's
perceptions of risks have a direct impact on the decisions they make
(Starr, 1969; Slovic, 1987; Slovic et al., 1974, 1976, 1979; Fischoff et
al., 1978, 1979; Sjoberg, 1999; Siegrist et al., 2005). Applying this
theory to organizational risk behaviors, the research indicates that
management's perceptions of various organization security risks
lead to strategic decisions that often result in inadequate safety
measures. This has been referred to as "executive blindness"
(Slovic, 1987). Key decision makers tend to misperceive events by
ignoring probabilities and instead using heuristic-based mechanisms to
measure uncertainty or avoid dealing with it. This leads to a reactive
approach to organizational security threats (Slovic et al., 1974).
Misperception can lead to an "optimistic bias"
(Helweg-Larsen and Shepperd, 2001) by top management. The optimistic
bias refers to the tendency for people to believe that they are more
likely to experience positive events and less likely to experience
negative events. This has been demonstrated in events such as auto
accidents (McKenna, 1993), earthquakes (Helweg-Larsen, 1999), and crime
(Perloff & Fetzer, 1986). The optimistic bias can be observed at both
the individual level and group level, and results in people rating their
own risks as lower than other similar people or groups (Helweg-Larsen
and Shepperd, 2001). People perceive that they are better prepared to
deal with negative events than are others. This leads to the first
hypothesis:
H1: Managers perceive the level of organization security within
their firm to be high.
The optimistic biases may also interfere with the institution of
preventive measures to address risks (Helweg-Larsen and Shepperd, 2001),
such as organizational security policies and security awareness
training. Managers may feel overly optimistic regarding their
employees' awareness of organization security policies. This type
of optimism can also be attributed to managers' perception that the
organization's employees are part of a homogeneous
"in-group" whose behavior is based on a positive exemplar,
often the managers themselves (Judd & Park, 1988).
Therefore, since managers read and adhere to organization security
policies, they perceive that their employees will do the same. This
leads to the second hypothesis:
H2: Managers perceive that employees adhere to established
organization security policies.
When making decisions, managers do not always have sufficient
knowledge regarding threats to their organizational information. The
manager will use all information and experiences that are readily
available; however, this often leads to a technology-based approach to
addressing organization security risks (Dhillon, 2001). Managers will
often turn to trusted advisors whose opinions they value (Siegrist &
Cvetkovich, 2000). These advisors are often so-called experts who share
the same values that the manager believes are important in a specific
situation (Earle & Cvetkovich, 1995). For example, when information
security advice is sought, management typically turns to the Information
Security Officer (ISO) or the Information Technology (IT) manager. This
advice, though valuable in protecting the organization from security
breaches, is often mostly technological in nature. If management relies
solely on this advice, the human element of organization security
management may be entirely ignored. To make more informed decisions,
managers must increase their awareness of their environment (Gigerenzer,
2001). This leads to the third hypothesis:
H3: Managers fail to perceive routine employee actions that may
unintentionally expose the organization to security risks.
RESEARCH METHOD
A qualitative understanding of organization security can be
advantageous because of the ability to research the phenomenon within
the scope of the organization (Dhillon & Backhouse, 2001). Yin
(2003) states "the distinctive need for case studies arises out of
the desire to understand complex social phenomena" (p. 2).
Therefore the case study is an ideal methodology for investigating the
concerns of organizational security, allowing an in-depth investigation
of a phenomenon in its original context (Benbasat et al., 1987). The case
study methodology can be used both for the development of theory and for
the testing of existing theory (Yin, 2003). Case studies may be
exploratory, explanatory, or descriptive in nature and are especially
effective when answering "how" and "what" questions
that are exploratory in nature (Yin, 2003), which is the purpose of this
study.
Construct validity was satisfied through data triangulation and
having a draft of this research reviewed by a key informant (Yin, 2003).
The multiple sources of data used in this case study are (1) interviews,
(2) documents, and (3) direct observation. Because of the primary
researcher's past experience within the local financial industry,
and, by having established a positive rapport with the Chief Executive
Officer (CEO) and Chief Information Officer (CIO), full access to all
employees, all documents, and office locations was granted. Twenty-four
(24) employees were interviewed from all levels of the organization.
Documents, including emails, policies, examination reports, and the
employee handbook were made available. In addition, access was given to
observe employee behavior during office hours, and to explore the
offices after hours. After-hours access allowed the researcher to roam
the organization to look for evidence of organization security risks.
Permission was also granted to "dumpster dive" in an attempt
to locate sensitive information that may have been thrown in the trash
receptacles (instead of shredder bins).
To ensure reliability, the case study protocol proposed by Yin
(2003) was followed. Employee interviews were conducted over a six-month
period. The interviews were focused, lasting about an hour each (Merton
et al., 1990). The questions were semi-structured but allowed for open
responses and discussion from the interviewees. The questions attempted
to gain understanding from the employees regarding the following:
1) Their understanding of organization and information security,
2) Their perception of the information security level at their
organization,
3) Their understanding of organization security risks and the
behavior that causes information security risks,
4) Their knowledge of consequences to the organization for security
risks,
5) Their specific behavior or actions that have resulted in
organization security risks,
6) The level of trust the employees have in their peers,
7) The extent to which countermeasures are implemented within the
organization to prevent security risks.
To further add to the reliability, a case study database was
created which allowed the data to be analyzed by someone other than the
researchers (Yin, 2003). The interviews were recorded on tape and then
transcribed, with the date of each noted (Miles & Huberman, 1994).
The interview transcripts were combined with written documentation and
personal observations to form the entire case study database.
External validity involves the extent to which the results of this
study can be generalized (Yin, 2003). The use of theory in a single-case
study contributes to the external validity. The following case study is
a theory-based interpretation of managerial perceptions that may
unintentionally lead to security risks in a single organization.
External validity could be improved with studies in other organizations
to corroborate the findings of this study (Yin, 2003).
Case study research also requires a high degree of ethical
consideration (Roth, 2005), especially when the research involves a
subject such as organization security. In any research, ethical
considerations must remain an ongoing part of the research, well beyond
the initial signing of a consent form (Malone, 2003). The CEO and CIO of
the organization studied served as "gate-keepers" who allowed
access to the organization and employees (Miller and Bell, 2002). It was
important to keep these two individuals updated on a constant basis.
Each staff interview was conducted with only the primary researcher and
the subject employee present. Document review was conducted by the
researcher alone after the documents were provided by the CIO. All other
events (dumpster diving and after-hours observations) were conducted
with the CIO present. After each phase of the research, the CEO was
briefed on the findings, and additional consent was sought (and granted)
for the succeeding phase of the research.
The Organization and Analysis
First South Savings (FSS) is a pseudonym for an existing financial
institution located in a major metropolitan area in the southern United
States. There are seven FSS branches throughout the metropolitan area,
consisting of approximately 200 full and part-time employees. Of the
seven branches, one branch is located at the FSS headquarters. At this
location are the executive offices, the information technology (IT)
department, accounting, credit card services, wire transfers, and other
back-office and support services. This organization was chosen for
several important reasons. First, one of the authors served as an
executive in this industry for over 10 years before entering the
academic community, therefore providing additional insight into the
organizational environment and the issues facing the industry. A
continued involvement in the industry through speaking at various
industry conferences and educational sessions, as well as publishing
articles in the industry's journals, has established the author as
an industry insider. Being considered an industry insider provided a
high level of legitimacy with the FSS staff, resulting in
employees' willingness to divulge information and greater access to
organizational resources (Malone, 2003).
Second, FSS was chosen because it is in the financial services
industry. Industry context has been shown to be an important factor
when conducting organization and information security research
because of the nature of the assets to be protected (Straub
& Nance, 1990). The financial industry deals with a greater amount
of sensitive and potentially damaging information than other industries.
Specifically, banking institutions have a lot at stake when it comes to
organization security. Therefore, employee behavior that unintentionally
(or intentionally) leads to security risks could have greater
consequences. Employees in industries with high degrees of information
sensitivity should be more concerned about security (Goodhue &
Straub, 1991). Because of the potential for information loss, the
financial industry faces strict regulatory requirements regarding the
protection of information. The Gramm-Leach-Bliley Act (GLBA) was
instituted in 1999 to protect financial information. GLBA requires all
financial institutions to secure customer data from unauthorized access
(SBC, 1999). Financial institutions also face regular federal
examinations to ensure regulatory compliance. In the past few years,
information security has been included in those examinations. Based on
the requirements posed on FSS by GLBA and federal regulators, it was
expected that FSS would consider organization security a high priority.
This leads to the first hypothesis to be tested:
H1: Managers perceive the level of organization security within
their firm to be high.
During the initial interview with the CEO and subsequent interviews
with other executives of FSS, it was confirmed that they perceived that
organization security was a top priority and that FSS had a
significantly high level of security. The first question asked of each
executive staff member was "how secure is your organization's
information?" According to the CEO:
On a scale of one to ten I would say we're an eight. We have a lot
of in-house expertise and I think we have devoted a lot of
resources trying to provide good security. I think that we have had
pretty good performance down the line; however that's more
intuitive than data based.
Time and time again, similar answers were echoed, such as the
following made by the Chief Financial Officer (CFO):
I believe our information security is solid. My opinion is not
based on our IT department, but based on what the so-called experts
have told me. That's where my decision is coming. Not that I have
any concerns with our IT department, but if I hear it from an
expert what else am I to believe.
FSS invested in yearly security audits from an outside firm to
ensure the organization was providing adequate security. These outside
security audits seemed to give the organization a high level of
confidence in their organizational security. According to the VP of
Operations:
We have the outside audit firm come in and hack around and whatever
they do, then come back and I sat in exit interviews where they say
they found some places we need to improve on...some of it is
non-critical and some is a little more critical. Based on what I've
seen and from what people have come up and told us I feel pretty
good.
The CEO, as well as the other executives, perceived that the
outside security audits and the federal examinations served more as a
validation of their existing safety measures than a service that
improved organizational security.
I think what a third-party does is either validate or invalidate
your intuitive feelings about where you are. So I think that in the
scope of how the world is sitting in this area I was pleased with
what the third-party said. We also had an examination last year
with regulators coming in. Again that was confirmation that
compared to others in the industry we are in pretty good shape.
Predictably, reliance on the results of the third-party audit has
led to a false sense of security by the executive staff. This reflects a
typical technology-based approach to organization security. The
objective listed in the audit report was as follows:
"We understand that the primary objectives of the project were
to perform a comprehensive IS controls review of the automated controls
within the existing computer environment and systems. In addition, we
performed a firewall security review and internal and external intrusion
testing. Our findings and recommendations will be useful in enhancing
the systems and in providing cost-effective internal control
improvements."
Clearly, from reading the objective, it can be determined that the
security audit consisted of an assessment of the technological controls
of organizational security. However, when the report was presented to
the executive staff informing them that no security vulnerabilities were
discovered, management was convinced that organizational information was
well protected from security threats.
Even without considering the results of the outside audit, managers
felt the organization was secure based on their experience with the
controls that existed on the system, again reflecting a technology-based
view of organization security.
I'm comfortable with the system being secure. You have to have a
password to get into everything, so whether it be just the email or
the system itself ... you have a user name and a password in order
to get into it, so I'm fairly comfortable with security.
Even after discussions to distinguish "systems security"
from "information security", management's perception of
the security level of the organization was unchanged.
I think it's pretty high, especially when it comes to customer
data, that level of awareness has been raised over the last year,
year and a half. People are reminded of it on a regular basis. Is
it 100% effective? ... no, nothing is 100% effective. But I think
the general awareness of protecting customer information is fairly
high.
Management did not perceive information security to be a problem,
whether the information is in the form of electronic data within the
computer-based system or hard-copy data. Some managers referred to the
results of the recent federal regulatory examination, which included
ensuring the protection of customer information. After reviewing the
examination report and speaking to the CIO, it was discovered that the
information security section of the examination consisted of a
self-report questionnaire that was completed by the CIO. When questions
regarding information security were asked, such as "has management
established and documented an adequate information security policy to
provide for the overall direction and implementation of information
security", the CIO simply answered "yes". Presumably
because of a lack of expertise, the federal examiners who conducted the
examination at FSS simply accepted the answers provided by the CIO. As a
result, the examiners reported that information security at FSS was
"satisfactory". This contributed to the overall perception of
management that the organization was successfully providing adequate
security.
The consensus of the executive staff was that FSS considered
organization security a top priority and that the organization had taken
appropriate measures to ensure the protection of its information. The
primary reason given for management's perception was the belief
that FSS had sufficient policies in place to address security issues. A
review of the FSS employee handbook found policies addressing physical
access control to the buildings and specific areas within the branches,
password creation and protection, the securing of information, and the
disposal
of customer and organizational information. Management perceived that
employees were reading and following these policies. If organizational
information was put at risk, they perceived it would be the result of an
outside attack or an internal employee's deviant behavior. Employee
behavior that would unintentionally put the organization at risk was
perceived to happen only on rare occasions, not often enough to
warrant concern. This leads to the second hypothesis:
H2: Managers perceive that employees adhere to established
organization security policies.
Organization security models have stressed the importance of the
establishment and implementation of security policies (Segev et al.,
1998). Security policies at FSS were posted on the company intranet and
updated throughout the year as needed. All employees had access to the
intranet and were encouraged to read the policies. Once a year,
employees were asked to sign a document verifying that they had read the
policies. Management was asked about their perception of three areas of
employee behavior which were addressed in their organization security
policies, which could unintentionally lead to security risks: (1)
revealing/sharing system passwords, (2) leaving sensitive information
unsecured, and (3) throwing sensitive information in the trash.
Management perceived that these were not problems for FSS because of
existing policies that prohibit such actions. When asked if they thought
employees would give out or share their system password, management
unanimously proclaimed that employees would not. Managers felt that
employees were well aware of existing password policies at FSS, and
fully understood the importance of protecting system passwords.
I wouldn't sit here and tell you that it would be 100%, depending
on who was asking some people would probably offer it up, but
overall most would not.
Regarding leaving sensitive information unsecured, managers again
felt that this rarely occurred at FSS. Each office had a lock on the
door and every desk and file cabinet had locks. All employees were given
keys to the locks for their work areas. The securing of information, as
required by GLBA, had been stressed to all employees.
Employees understand the importance of securing information. Even
those who have locks on their doors know they have to put
information away at night and lock it in their desks or file
cabinets because the cleaning crew still comes in and empties the
trash. The information we have here is just too sensitive to leave
out in the open so we make it a top priority to see that it doesn't
happen. Gramm-Leach-Bliley really opened our eyes to protecting
the privacy of our customer's information.
Management also perceived that the FSS staff was quite effective at
shredding sensitive information.
Anything dealing with customers' accounts goes to that shred bin
and it's kept locked up in a back room with the door shut and the
cleaning people don't go into. We are pretty good about putting
things in shredder bins. Could I 100% say there is nothing in there
[the trash], but all in all the chances of it happening are very
slim.
Managers again pointed to the existence of a policy that stressed
the importance of shredding sensitive information. They also pointed out
that the number of shredder bins that were located throughout FSS made
it very convenient for employees to use. To add to the convenience, each
employee had an individual "shred can" at their desk in
addition to their trash can. Employees were encouraged to keep the two
receptacles separate to prevent accidentally throwing sensitive
information into the trash can. A tour of the shredder bins located
throughout the main branch was conducted. The shredder bins were large
plastic garbage-can-like receptacles. Each bin had a large slit in the
top approximately two feet long and five inches wide to allow employees
to put large quantities of paper in at one time or to facilitate thicker
green-bar reports. Each shredder bin was secured with a padlock that
required a key to unlock. The shredder bins did seem to be located in
areas which made access easy for most employees. The "shred
cans" at employees' desks were observed. These "shred
cans" were clearly marked and kept very convenient for employee
use.
I see them taking their shred each day over there [to the shred
bins], but if they had something in the trashcan I don't go and
check everybody's trashcan at night. Now on the teller line all of
their shred is right there at their feet as they are working
throughout the day and if they want to throw something in the trash
it's back on the opposite side of the wall where they would
actually have to get up out of their chair and I don't think they
would get up to throw something in that trash can other than their
trash.
Because of the existence of policies that had been established to
protect organizational information, management perceived that these
policies were being followed by the staff. It was also perceived that
department supervisors were effective in the enforcement of these
policies. To verify the accuracy of management's perceptions,
employees were interviewed. Because of their close proximity, employees
have been shown to be the best source for understanding the behavior and
actions of other peers (Murphy & Cleveland, 1991). Behavior that is
observed by employees differs from that observed by management,
because employees have opportunities to see diverse and disparate
behaviors of which managers may not be aware. In addition to interviews,
employee behavior was observed to help test the final hypothesis:
H3: Managers fail to perceive routine employee actions that may
unintentionally expose the organization to security risks.
System passwords
The first attempt to validate or invalidate the perceptions of
management was through the interview process. The interview process
began with six employees of the IT department. These employees ranged in
length of employment at FSS from two weeks to several years. After
reviewing the results of the outside security audit, expectations were
quite high that the IT department was making every attempt to keep the
organization's information secure (from a technological point of
view). Throughout the interviews, it was validated that the employees of
the IT department seemed effective in providing technology-based
solutions to keep information secure. Interview results showed that the
employees of the IT department would not share their personal passwords
with other employees. However, interviews revealed that employees of the
IT department were sharing administrator passwords. Each of the IT
employees had an individual network ID with administrator access;
however when logging on to perform networking functions, employees would
use the generic "Administrator" ID and password, which all IT
employees had knowledge of. By doing this, there was no record of which
IT employee was accessing the system, making it possible for information
to be put at risk without the possibility of detecting which employee
caused the risk.
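The attribution problem described above can be made concrete with a small sketch. This is our illustration, not FSS's actual systems; the account names and log entries are hypothetical. It shows why actions taken under a shared generic "Administrator" ID cannot be traced back to an individual, while actions under individual admin IDs can.

```python
# Hypothetical sketch: why a shared "Administrator" account defeats
# audit attribution. Account names and log entries are invented.

# Simulated audit log entries: (account_used, action)
audit_log = [
    ("jsmith-admin", "reset firewall rule"),      # individual admin ID
    ("Administrator", "exported customer table"),  # shared generic ID
    ("mdoe-admin", "created user account"),
    ("Administrator", "changed backup schedule"),
]

# Accounts that map one-to-one to a known person
individual_ids = {"jsmith-admin": "J. Smith", "mdoe-admin": "M. Doe"}

def attribute(entry):
    """Return the responsible person, or None if unattributable."""
    account, _action = entry
    return individual_ids.get(account)  # shared IDs yield None

attributable = [e for e in audit_log if attribute(e) is not None]
unattributable = [e for e in audit_log if attribute(e) is None]

print(f"{len(unattributable)} of {len(audit_log)} actions cannot be "
      f"traced to an individual employee")
```

Every action logged under the shared ID is invisible to after-the-fact review, which is precisely the risk the IT department's practice created.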
Password sharing was also found throughout FSS, including all
branches visited. Two staff employees and one branch manager admitted
they would give out their passwords if asked by people they trusted,
such as management or IT personnel. One actually admitted she had
allowed another employee to use her ID and password. Another example of
password sharing was said to be common at FSS.
We had a situation where our computers went down and all of our
work had to be hand written, so when the computer system came back
up everything had to be logged in and so by giving my password to
the branch manager and assistant branch manager they were able to
go in and help post my work and other tellers' so we were just up
here until 9pm instead of midnight. I can't remember if I changed
my password the next day, but probably not, and I doubt that the
other tellers did either.
Access to these system passwords would allow someone to perform
financial transactions using another teller's identification (ID),
preventing the detection of the act and identification of the abuser.
Even though most employees claimed that they would not reveal their
system password, it was believed by the CIO (and from the primary
researcher's past experiences in the industry) that employees were
simply stating what they felt was the "appropriate answer".
The CIO suggested that this should be verified by having his IT staff
call employees and ask for their password. The IT staff contacted 60
employees randomly selected from the employee call list, being sure to
select employees from every level of FSS, including executives. The
employees were simply asked for their system password. Of the 60 calls
made, 10 went directly to voice mail, therefore eliminating those
employees from consideration. Of the 50 employees that were contacted,
50 passwords were surrendered, with only one employee providing any
objection before eventually surrendering his password. All employees who
surrendered their passwords had their passwords automatically reset,
forcing them to change it immediately.
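The arithmetic of the exercise is worth making explicit. Using only the figures reported above (the variable names are ours), every employee who was actually reached surrendered a password:

```python
# Tally of the password audit described above; figures are from the
# text, variable names are our own.
calls_made = 60
voicemail = 10                      # excluded from consideration
contacted = calls_made - voicemail  # 50 employees actually reached
surrendered = 50                    # every contacted employee complied

surrender_rate = surrendered / contacted
print(f"Contacted: {contacted}, surrendered: {surrendered} "
      f"({surrender_rate:.0%})")
```

A 100% surrender rate among contacted employees, across every level of the organization including executives, is the starkest single datum in the study.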
In follow-ups, many employees said that even though they knew it
was against FSS policy to give out their password, they thought it was
okay because they believed that IT personnel could access their password
anytime through the system. Others claimed that they trusted the IT
staff, believing that their password would not be used for any
fraudulent activities. There was unanimous agreement among the employees
who surrendered their passwords--there was no intention to put
FSS's information at risk. (Note: To prevent any negative effect on
employee morale, after the analysis an email was sent to all employees
stating that the organization had failed to properly educate employees
about password procedures; thereby, eliminating any apparent blame on
the employees for their actions.)
The other two employee actions investigated were (1) throwing
sensitive information in the trash and (2) leaving sensitive information
unsecured. Both of these actions involve the protection of information
that is no longer only contained within the computer-based systems, but
now also resides in the forms of printed reports, customer receipts,
loan applications, and other confidential information. To verify these
other two actions that could unintentionally put the organization at
risk, interviews were conducted and it was arranged to have access to
the FSS headquarters and main branch after hours. The intent was to
observe the existence of unsecured information and to perform
"dumpster diving" to see if any sensitive information had
found its way into the trash. The CIO was assured that this entire
process would take no more than one hour and supervised the researcher
during the investigation.
Discarded information
Employees were asked about throwing sensitive information in the
trash. All employees interviewed proclaimed that this did not happen at
FSS. They stated there were strict rules about using the shredder bins
and they were adamant that those rules were strictly followed. An
after-hours "dumpster diving" expedition contradicted the
employee interviews. At the first stop, the marketing department, a
discarded list containing the names and telephone numbers of the senior
management and the Board of Directors of FSS was found. A stop in the
office of the accounting manager resulted in finding documents
containing FSS employee information, including employee name, FSS
account number, and employee social security number. Also in the trash
were copies of the accounting manager's personal checks, as well as
confidential documents that had been manually torn, but were easily
pieced together to identify names and account numbers. The next dumpster
diving destination was the teller line, where several customer receipts,
each containing customer name, account number, and account balance were
found. The trash was also checked in a community printer room in the
branch area. Inside this trash can were several completed loan
applications and other documents that had been printed and discarded.
An interesting observation was also made regarding the "shred
cans" that were at employees' desks. Many employees do not
empty their "shred cans" into the shredder bins at the end of
the day, instead choosing to wait until the "shred can" fills
up. During the after-hours observation, the cleaning crew was observed
emptying the "shred cans" into the trash, unbeknownst to
employees or management.
Securing sensitive information
During interviews, employees were also asked if they left sensitive
information unsecured. Many admitted that there were occasions when
information was accidentally left on desks after hours, but according to
them those occasions were rare. Each attributed this to the FSS policy
that required employees to secure all sensitive information upon leaving
every night. The results of after-hours observation found numerous
violations which were potentially disastrous for FSS.
The marketing manager's office, loan manager's office,
and the accounting manager's office were all unlocked. In the inbox
on the marketing manager's desk were documents containing FSS
employee information, including employee name, social security number,
address, and salary for several employees. On the loan manager's
desk were customer profiles and lending information containing customer
names, address, telephone number, social security numbers, account
numbers, etc. There was a large locked filing cabinet in the office;
however, the key was left in the lock. Upon unlocking the cabinet, it
was found to contain the information on every loan currently at FSS.
Also unsecured in the office was a notebook
containing the credit scoring formula for FSS. There was a green-bar
report on the accounting manager's desk which contained general
ledger numbers and descriptions. Also on the desk were loan charge-off
reports which contained all the necessary information to steal a
customer's identity. An open box containing confidential customer
information was on a chair in the corner. Finally, a folder was observed
in the inbox that contained the procedures for performing wire transfers
at FSS.
The automated clearinghouse (ACH) room and the wire transfer room
also showed vulnerabilities. In the ACH room there was a large locked file
cabinet with the key still in the lock. Inside the file cabinet was all
of the payroll information for every customer who currently had their
payroll directly deposited at FSS. These records included customer name,
address, telephone number, social security number, FSS account number,
date of birth, place of employment, and salary. There were thousands of
customer records left unsecured. The wire transfer room contained
completed copies of wire transfers with customer information, sending and
receiving financial institution routing and transit numbers, dollar
amounts of transfers, and customer social security numbers. There were
also binders containing many international wire transfers that had been
processed within the last week, showing all the pertinent information
readily available. A binder containing the wire instructions was also
found, with the ID and password to the wire transfer system written
inside the front cover.
The Individual Retirement Account (IRA) area and credit card area
also had significant violations. In the IRA area a file cabinet
containing all IRAs at FSS was unlocked. IRA information was also
found on the employee's workstation. At this workstation all of the
overhead compartments were unlocked except for one. The CIO explained
that this compartment contained FSS company checks, which must be kept
secure because they are preprinted with FSS's corporate account
number. However, even if accessed, these checks must be signed, which is
done using an automatic "check signer" which requires a key to
operate. Upon opening a drawer at the workstation a set of keys was
found that unlocked the overhead compartment containing the corporate
checks. The "check signer" was found in the same area with the
key still in the lock.
The credit card department contained many overhead compartments
which were all locked; however there were several boxes on the floor
throughout the room. These boxes contained reports detailing the credit
card information of FSS customers, including credit card number,
expiration date, customer name and address, and available balance; in
short, everything needed to fraudulently use the credit cards.
The results of this one hour walk-around showed that, as an
organization, FSS was not as secure as management perceived. This, along
with the results of the employee password exercise, also indicates that
management at FSS is "blind" to the security risks that are
occurring at FSS. Thus, there is a significant difference between
management's perception of organization security and the actual
level of security that exists at FSS.
Reaction of the CEO
Upon the completion of the interviews, documentation review, and
observations, a final follow up meeting was scheduled with the CEO. He
was shocked at the level of security risks that were found at FSS.
It's surprising to the extent that we are open from the information
side. I'm really amazed that you found it that easy. The stuff
lying on the desk, there is some of that going on and ... there is
no punishment for that, but it's amazing that people are so
fearless about giving away passwords and access.
He was also amazed by the findings regarding unsecured information
and information being thrown away.
We don't have someone supervising every area to make sure we are
doing a good job ... make sure we are not putting important
information in the trash can. We don't have someone coming around
making sure the desk is clear of paperwork that has important
information.
During the initial interview, the CEO ranked the FSS's
security level an "eight" on a scale of one to ten. However,
after reporting the findings of this study, his opinion changed
significantly.
On the people side I guess it's more like a three. That's where the
exposure is.
He also felt that the findings were so significant that immediate
actions needed to be taken to correct the actions.
We need to get right on it. It's wide open. We are laying here wide
open. It's not really an IT issue.
In summing up his thoughts, the CEO seemed to finally understand
the significance of the organization security risks at FSS.
It's kind of like we are leaving the backdoor unlocked every night
after night--nothing happens--eventually something does happen. From
then on you are sure to lock your door.
DISCUSSION
This case provides support for the three hypotheses presented in
this study. Management at FSS perceived their organization security
level to be high. The executives unanimously agreed that FSS had above
average security and that they were doing everything possible to protect
the organization. These perceptions were not based on the actual
probabilities of security threats, but rather on simplification
strategies developed as a result of the technology-based audits and federal
examinations (Slovic, 1987). Without personal technological expertise,
managers were forced to rely on the reports of these third parties
(Siegrist & Cvetkovich, 2000). The CIO understood that the reports
focused on the technology aspect of security, for which he was
responsible. Therefore, he was pleased with the findings. Management
failed to consider the human element of organization security; "...
we often overlook the human solution and instead opt for technology
solutions, when in fact the human factor must be addressed first, with
technology assisting in the enforcement of desired human behaviors"
(Whitman, 2003). Strengthening their perceptions was the fact that FSS
had not had any security incidents (of which they were aware). It often
takes a security incident to open management's eyes to threats
within their organization (Dhillon & Moores, 2001). Therefore,
hypothesis 1 is supported.
Management did not perceive that employees were putting the
organization at risk because of the established organization security
policies. Security policies have been identified by researchers as an
effective deterrence to security threats (Whitman, 2003). However,
policies only deter risk-causing behavior if employees read the policies
and adhere to them. FSS did have what was referred to as 'security
policies', however most were focused on physical security and the
threat from robbery. There was a systems usage policy that addressed the
creation and changing of passwords, as well as the importance of keeping
passwords confidential. Another general policy addressed the usage of
the shredder bins and the securing of information after hours. These
policies were on the FSS intranet which was available to all employees.
However, there was no monitoring to ensure these policies were being
read. Furthermore, there was no monitoring to ensure these policies were
being followed. Because an overly trusting environment existed
within FSS, monitoring was felt to be unnecessary. Therefore, managers
were unaware that organization security protocols were not being
followed. This supports hypothesis 2.
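The monitoring gap named above can be sketched concretely. The following is our illustration, not a description of any FSS system; the employee names and the acknowledgment set are hypothetical. It shows the kind of minimal acknowledgment tracking that would have revealed whether intranet policies were actually being read:

```python
# Hypothetical sketch of policy-acknowledgment tracking; names and
# data are invented for illustration.
employees = {"alice", "bob", "carol", "dave"}
acknowledged = {"alice", "carol"}  # confirmed reading the policy

unconfirmed = employees - acknowledged     # who still needs follow-up
coverage = len(acknowledged) / len(employees)

print(f"Policy acknowledgment coverage: {coverage:.0%}; "
      f"follow up with: {sorted(unconfirmed)}")
```

Even this trivial record-keeping would have distinguished "a policy exists" from "the policy has been read," a distinction FSS management did not make.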
People will respond only to the threats that they perceive (Slovic
et al., 1980). Therefore, management was unaware of the employee
behavior that was occurring, and of its frequency, which was
unintentionally putting their organization's information at risk.
Because of these perceptions, management took an ethnocentric
approach to organization security management, focusing on the threats
from untrusted outsiders and ignoring the potentially more dangerous
threats from trusted insiders (Allport, 1954). Management perceived
overall security at FSS to be high. They were comfortable with the
technology countermeasures that had been implemented to minimize
security incidents. They felt that the threats presented by employee
behavior that could unintentionally put the organization at risk were
minimal. By perceiving that employee behavior was not a threat, the
issue was not addressed by management. Management put too much trust in
their employees, resulting in a lack of supervision (Dhillon &
Moores, 2001). This study showed that management's perception of
this type of employee behavior and the actual occurrence and frequency
of this behavior was quite different. Therefore, hypothesis 3 is
supported.
CONCLUSION
In this case, the data from FSS revealed that a significant factor
in the occurrence of risk-causing behavior was managerial perception.
Therefore, to reduce unintentional security risks, management should not
seek to condemn the employees. They should look instead to the actions,
or inaction, of the management staff. It is managers' (mis)perception
of risk-causing behavior within FSS, together with an organizational
dependence on a technology-based approach that ignores the human element
of risk compliance, that needs to be addressed. It is this blindness that
has led to insufficient countermeasures to protect organizational
information from the unintentional risks caused by employee actions.
Adding to the possible severity of these inadvertent employee
actions is management's perceptions of the frequency of these
actions. Managers often fail to consider the actual probabilities of
these types of employee actions or to monitor the occurrences of these
actions. Instead, management's perception is based largely on
heuristics, which can negatively affect their decision process (Slovic
et al., 1976). Therefore, the gap between management's perceptions
of the rate of recurrence of organization security risks and the actual
frequency of risk-causing behavior by employees can lead to serious
threats to organizational security.
While these findings alone are important, another overriding focus
of this research is to demonstrate, in general, how managerial
perceptions might be analyzed in realistic organizational settings. Das
(2003) complained that the study of managerial perceptions occurs
without "even a modicum of appreciation of the real-world
managerial environment." In this study, we analyze the perceptions
of real-world managers who work in a real-world business which may face
the real-world consequences of (in)activity based on those perceptions.
So, not only were we able to provide perceptual clarity to the subject
organization, we were also successful in affirming Slovic's
perception of risk theory (1978) as the theoretical foundation for this
study. Specifically, managers did perceive organization security risk
along two primary factors: the severity of the consequence and the
probability of its occurrence (Slovic et al., 1976). However, these
perceptions were filtered through the lens of simplified cognitive
heuristics which resulted in their drawing inaccurate conclusions about
the true level of potential threats to organizational security. This
finding is in agreement with Mezias and Starbuck (2003) who theorized
that "managers may have inaccurate perceptions regarding
information that is central to their jobs as well as about information
they believe is someone else's responsibility." It has been
demonstrated that organizational behavior initiated because of misguided
perceptions may sometimes have disastrous consequences (Nystrom &
Starbuck, 1984). Here, managerial inaction may have been disastrous as
well. However, organizations can take action to indemnify themselves
against such damage if they seek out, and correct, perceptual
inaccuracies.
This research can prove to be valuable to the practitioner
community by raising awareness of the existence of the insidious, yet
significant, problem of inaccurate managerial perceptions. By
understanding the problem, managers may seek better understanding of the
limits of their cognitive biases and devise policies and practices to
widen their perceptional base. Specifically, managers may need to
undertake training that increases individual situation analysis skills,
encourages shared situation analysis as a part of the normal working
environment, and fosters development of comprehensive information
sources from which to draw conclusions and make decisions (Endsley,
1995). Ultimately, the goal is to decrease erroneous managerial
perceptions to improve organizational performance.
For the academic community, this research introduces another method
to study questions about managerial perceptions. In past studies,
managers from various industries typically completed questionnaires
about variables that were objectively reported in industry and
organizational reports. Then, these cognitive assessments from managers
were compared to the objective data in those reports to gauge the
manager's perceptual accuracy (Tosi, Aldag, & Storey, 1973;
Downey, Hellriegel, & Slocum, 1975; Mezias & Starbuck, 2003).
However, the design of this study obligated the researchers to obtain
direct evidence of the situation utilizing a "hands-on"
analytical approach instead of relying on second- or third-party
business reports which might, or might not, be entirely accurate. While
not practical for all research questions, this technique increased the
validity of this effort because all of the information was generated and
controlled completely by the investigating parties. Personal interviews
of executives were performed to get precise information regarding their
perceptions of the situation of concern; after which, the actual
evidence to confirm, or contradict, those perceptions was personally
rooted out by the investigating researcher. Akin to a police
investigation, the chain of evidence in this case remained
uncontaminated and, therefore, the findings led us to untainted
conclusions. It is our desire that additional studies concerning
managerial perceptions may also benefit from this approach. Last, other
organization and information security studies utilizing Slovic's
perception of risk theory could add to a greater understanding of
management's perception of security risks.
Some limitations of this research must be pointed out. Researchers
have argued that research conducted by someone considered an
"insider" may result in subjects' willingness to divulge
information (Malone, 2003); however, the role of an insider conducting
research can also be very complex and situational (Sherif, 2001). An
'insider's' familiarity with the industry can lead to a biased
interpretation of the information collected. Avoiding biased
interpretations of information was given a great deal of consideration.
Interview transcripts and observation notes were reviewed by the CIO;
however, this is no guarantee that the researcher's line of
questioning was not influenced by personal knowledge of the industry.
This research was also limited by some other ethical
considerations. For example, employees were questioned about their
knowledge of social engineering (the art of deception to gain
information) and (after explaining the definition) if they would be
susceptible to such tactics. Employees consistently believed that social
engineering tactics would not work on them. Although the primary
researcher and the CIO felt employees were naive regarding social
engineering, testing such beliefs was thought to be unethical since this
would involve intentional attempts to deceive the employees. This
scenario is regarded as substantially different from the password
gathering exercise, which was conducted by the IT department of FSS and
did not involve deception.
Regarding replication, there is the question of whether this type of
research should be repeated. Although researchers must walk an ethical
tightrope, there is a lot to gain from conducting research about
managerial perceptions in this manner. Employees may feel reluctant to
admit their actual (versus perceived) activities because of potential
negative ramifications from managers. Because of this, surveys and
interviews alone may not provide the necessary data to properly
understand organizational problems. Conducting case research can give a
more in-depth understanding of the phenomena; however, the researcher
must walk a fine line between data gathering and the ethical
considerations of research subjects. Thus, researchers are encouraged to
proceed with caution, always keeping key individuals informed and
constantly gaining consent for every phase of research.
REFERENCES
Allport, G.W. (1954). The nature of prejudice. Addison-Wesley,
Cambridge, MA.
Barr, P.S. & A.S. Huff (1997). Seeing isn't believing:
understanding diversity in the timing of strategic response. Journal of
Management Studies, 34(3), 337-70.
Barr, P., J.L. Stimpert, & A.S. Huff (1992). Cognitive change,
strategic action and organizational renewal. Strategic Management
Journal, 13, 15-36.
Benbasat, I., D. Goldstein, & M. Mead (1987). The Case Research
Strategy in Studies of Information Systems. MIS Quarterly, 11(3),
369-386.
Boland, R.J., P. Salipante, J. Aram, S.Y. Fay, & P.
Kanawattanachai (2001). Knowledge representations and knowledge transfer.
Academy of Management Journal, 44, 393-417.
Bourgeois, L.J. (1985), Strategic goals, perceived uncertainty, and
economic performance in volatile environments. Academy of Management
Journal, 28, 548-73.
Boyd, B.H., G.G. Dess, & A.M. Rasheed (1993). Divergence
between archival and perceptual measures of the environment: causes and
consequences. Academy of Management Review, 18, 204-26.
Bromiley, P. & K.J. Euske (1986). The use of rational systems
in bounded rationality organizations: A dilemma for financial managers.
Financial Accountability & Management, 2(4), 311-320.
Cyert, R.M., & J.G. March (1963). A behavioral theory of the
firm. Prentice-Hall, Englewood Cliffs, NJ.
Daniels, K. (2003). Asking a straightforward question:
Managers' perceptions and managers' emotions. British Journal
of Management, 14 (1), 19-22.
Das, T.K. (2003). Managerial perceptions and the essence of the
managerial world: What is an interloper business executive to make of
the academic-researcher perceptions of managers? British Journal of
Management, 14, pp. 23-32.
Dhillon, G. (2001). Violation of safeguards by trusted personnel
and understanding related information security concerns, Computers &
Security, 20(2), 165-172.
Dhillon, G., & J. Backhouse (2001). Current directions in IS
security research: towards socio-organizational perspectives,
Information Systems Journal, 11, 127-153.
Dhillon, G., & S. Moores (2001). Computer crimes: theorizing
about the enemy within, Computers & Security, 20(8),715-723.
Downey, H. K., D. Hellriegel & J. W. Slocum Jr. (1975).
Environmental uncertainty: The construct and its application',
Administrative Science Quarterly, 20, 613-629.
Earle, T.C., & G. Cvetkovich (1995). Social trust: Toward a
cosmopolitan society, Praeger, Westport, CT.
Endsley, M. (1995). Toward a theory of situation awareness in
dynamic systems. Human Factors, 37(1), 32-64.
Fahey, L. and V.K. Narayanan (1989), Linking changes in revealed
causal maps and environmental change: an empirical study, Journal of
Management Studies, 26(4), 361-78.
Fischhoff, B., P. Slovic, S. Lichtenstein, S. Read, & B. Combs
(1978). How safe is safe enough? A psychometric study of attitudes
toward technological risks and benefits, Policy Science, 9, 127-152.
Fischhoff, B., P. Slovic, & S. Lichtenstein (1979). Weighing
the risks: Which risks are acceptable? Environment, 21(4), 17-38.
Gigerenzer, G. & R. Selten, (2001). Bounded Rationality: The
Adaptive Toolbox. Cambridge, MA: MIT Press.
Goodhue, D., & D. Straub (1991). Security concerns of system
users. A study of perceptions of the adequacy of security, Information
& Management, 20, 13-27.
Helweg-Larsen, M. (1999). (The lack of) optimistic biases in
response to the Northridge earthquake: The role of personal experience.
Basic and Applied Social Psychology, 29, 119-129.
Helweg-Larsen, M. & J. A. Sheppard (2001). Do moderators of the
optimistic bias affect personal or target risk estimates? A review of
the literature. Personality and Social Psychology Review, 5(1), 74-95.
Judd, C.M. & B. Park. (1988) Out-group homogeneity: Judgments
of variability at the individual and group levels, Journal of
Personality and Social Psychology, 54(5), 778-788.
Lindsay, P. H. & D.A. Norman (1977). Human Information
Processing: An Introduction to Psychology. New York, Academic Press.
Malone, S. (2003). Ethics at home: informed consent in your own
backyard. International Journal of Qualitative Studies in Education, 16,
797-815.
McKenna, F. P. (1993). It won't happen to me: unrealistic
optimism or illusion of control. British Journal of Psychology, 84,
39-50.
Merton, R., M. Fiske, & P. Kendall (1990). The focused
interview: A manual of problems and procedures (Second Edition). Free
Press, New York.
Mezias, J.M. and W.H. Starbuck (2003). Studying the accuracy of
manager's perceptions: A research odyssey. British Journal of
Management, 14, 3-17.
Miles, M. & A. Huberman (1994). Qualitative data analysis: An
expanded sourcebook. Sage, Thousand Oaks, CA.
Mitnick, K. (2003). Are you the weak link? Harvard Business Review,
April, 3.
Miller, T. & L. Bell (2002). Consenting to what? Issues of
access, gate-keeping and 'informed' consent. Ethics in
Qualitative Research. M. Mauthner, M. Birch, J. Jessop and T. Miller.
London, Sage Publications, 53-69.
Murphy, K.R., & J.N. Cleveland (1991). Performance appraisal.
Needham Heights, MA: Allyn & Bacon.
Nystrom, P. C. & W. H. Starbuck (1984). To avoid organizational
crises, unlearn. Organizational Dynamics, 12 (4), 53-65.
Perloff, L. S. & B. K. Fetzer (1986). Self-other judgments and
perceived vulnerability to victimization. Journal of Personality and
Social Psychology 50, 502-510.
Rosenthal, D.A. (2003). Intrusion detection technology: Leveraging
the organization's security posture. Information Systems
Management, Winter 2003, 35-44.
Roth, W. (2005). Ethics as a social practice: Introducing the
debate on qualitative research and ethics. Forum: Qualitative Social
Research, 6(1).
Santos, M.V. & M.T. Garcia (2006). Managers' opinions:
Reality or fiction--A narrative approach. Management Decision, 44(6),
752-770.
SBC (1999). Conference Report and Text of Gramm-Leach-Bliley Bill,
S.B. Committee (ed.), Retrieved from http://banking.senate.gov/con.
Segev, A., J. Porra, & M. Roldan (1998). Internet security and
the case of Bank of America. Communications of the ACM, 41(10), 81-87.
Sherif, B. (2001). The ambiguity of boundaries in the fieldwork
experience: Establishing rapport and negotiating insider/outsider
status. Qualitative Inquiry, 7, 436-447.
Siegrist, M., & G. Cvetkovich (2000). Perception of hazards:
The role of social trust and knowledge. Risk Analysis, 20(5), 713-719.
Siegrist, M., C. Keller, & H.A. Kiers (2005). A new look at the
psychometric paradigm of perception of hazards. Risk Analysis, 25(1),
211-222.
Simon, H.A. (1956). Rational choice and the structure of the
environment. Psychological Review, 63, 129-138.
Simon, H.A. (1957). Models of man. Wiley, New York.
Sjoberg, L. (1999). Risk perception by the public and by experts: A
dilemma in risk management. Research in Human Ecology, 2(2), 1-9.
Slovic, P. (1987). Perception of risk, Science, 236, 280-285.
Slovic, P. (1998). Do adolescent smokers know the risks? Duke Law
Journal, 47(6), 1133-1141.
Slovic, P., B. Fischhoff, & S. Lichtenstein (1976). Cognitive
processes and societal risk taking. In J.S. Carroll & J.W. Payne
(eds.), Cognition and Social Behavior. Potomac, MD: Lawrence Erlbaum
Associates, 165-184.
Slovic, P., B. Fischhoff, & S. Lichtenstein (1978). Accident
probabilities and seat belt usage: A psychological perspective. Accident
Analysis and Prevention, 10, 281-285.
Slovic, P., B. Fischhoff, & S. Lichtenstein (1979). Rating the
risks. Environment, 21(3), 14-39.
Slovic, P., B. Fischhoff, & S. Lichtenstein (1980). Facts and
fears: Understanding perceived risk, in: Societal Risk Assessment: How
Safe is Safe Enough? R.C. Schwing and W.A.J. Albers (eds.), Plenum, New
York.
Slovic, P., H. Kunreuther, & G.F. White (1974). Decision
processes, rationality, and adjustment to natural hazards, in: Natural
Hazards: Local, National, Global, G.F. White (ed.), Oxford University
Press.
Starbuck, W. H. (1992). Strategizing in the real world.
International Journal of Technology Management, Special Publication on
Technological Foundations of Strategic Management, 8(1/2), 77-85.
Starbuck, W.H. & J.M. Mezias (1996). Opening Pandora's
box: Studying the accuracy of managers' perceptions. Journal of
Organizational Behavior, 17(2), 99-117.
Starr, C. (1969). Social benefit versus technological risk.
Science, 165, 1232-1238.
Straub, D., & W. Nance (1990). Discovering and disciplining
computer abuse in organizations: A field study. MIS Quarterly,
14(2), 45-60.
Tosi, H., R. Aldag & R. Storey (1973). On the measurement of
the environment: An assessment of the Lawrence and Lorsch environmental
uncertainty subscale. Administrative Science Quarterly, 18, 27-36.
White, M.P., J.R. Eiser, & P.R. Harris (2004). Risk
perceptions of mobile phone use while driving. Risk Analysis, 24(2),
323-334.
Whitman, M. (2003). Enemy at the gate: Threats to information
security. Communications of the ACM, 46(8), 91-95.
Yin, R.K. (2003). Case Study Research: Design and Methods (Third
Edition). Beverly Hills, CA: Sage Publications.
Richard G. Taylor, Texas Southern University
Jeff Brice, Jr., Texas Southern University