DETAILED SURVEY RESULTS
This is an informal survey. As one might expect, this report looks specifically at what the 522
respondents to this year's questionnaire had to say. In looking at this data, certain inherent
constraints on interpretation should be borne in mind.
First and foremost, this isn’t a random sample of all the people in the country who are
ostensibly responsible for the security of their networks. Rather, there is almost certainly a
skew created by the fact that this is the CSI community—members of the organization and
those who move in its orbit (attending paid conferences and the like) without necessarily
being members. It’s a community that is actively working to improve security. This pool, in
short, doesn't stand in for the organizations in the United States that are simply not paying
attention to security (and there are, unfortunately, all too many such organizations).
But an important question that we in the security field must have a ready answer for is this:
Do current best practices produce results?
In a profession filled with (often quite justified) concerns about what will be different and more
insidious about the next round of attacks, we must also take time to consider what the run-of-the-mill, present-day attacks look like and whether we’ve done anything worthwhile to keep
the attackers at bay. While much of the news in the information security field isn’t
encouraging, there’s arguably some fairly good news with regard to how practitioners who are
making a concerted effort are faring against commonplace threats such as computer viruses.
And while we’re not surveying the world at large, there’s reason to believe that changes in
survey results over time reflect changes in the CSI community. Five thousand surveys were
sent out and 522 were received back, a response rate of roughly 10 percent. That
level of response is quite respectable, but the question requiring judgment is
whether those who chose to reply were markedly different from those who did not.
Even if you imagine that those not answering the survey are altogether different in some
way from those who do, it’s interesting to note that the demographics of the respondents
have remained very stable over the years, as has the basic makeup of the CSI community.
We feel confident that similar groups complete the survey year over year. And, indeed, the
vast majority of the questions yield virtually the same statistics year over year. The answers
that have changed have primarily been the estimates of losses to cybercrime, and those
estimates have both risen and fallen dramatically.
One could argue, as some have done, that security professionals simply don’t have a clue how
badly they are beaten down and robbed by their hacker adversaries. If that’s the case, then
their estimates of financial loss should simply be ignored. Our view is that this can only be the
case if we take a needlessly dim view of the intellect of our peers. They almost certainly don’t
have an exact and accurate reckoning of losses due to, say, a denial-of-service attack (there’s
no standard way for arriving at such a number, so how could they?). But to say that they don’t
notice when their business is crippled due to such an attack is fear-mongering.
For our part, we think the rough reckoning of seasoned professionals is nevertheless worth
attentive consideration. When the group says they lost less money this past year than they
lost two or three years ago, we think it means they lost less money.
About the Respondents
The CSI survey has always been conducted anonymously as a way of enabling respondents to
speak freely about potentially serious and costly events that have occurred within their
networks over the past year. This anonymity introduces a difficulty in interpreting the data
year over year, because of the possibility that entirely different people are responding to the
questions each time they are posed. There is, despite that concern, real consistency in the
demographics year over year.
As figure 1 shows, the organizations covered by the survey span a wide range of sectors, both
private and public. The outer ring shows the current year's statistical breakdown,
while the inner rings show the prior years. There is a fair degree of consistency in the
breakdown over the past three years, though there have been some shifts due to the
addition of new categories (military and law enforcement) last year.
The largest share of responses came from the financial sector (22
percent), followed by consulting (15 percent), information technology (9 percent), and
health services (7 percent). The portion coming from government agencies (combining
federal, state, and local levels) was 13 percent (down 4 percentage points from last year), and
educational institutions accounted for 7 percent of the responses. The diversity of
organizations responding was also reflected in the 10 percent designated as “Other.”
Figure 2 shows that the survey pool leans toward respondents from large enterprises.
Organizations with 1,500 or more employees accounted for a little less than half of the
responses. As the chart shows, the percentages of respondents from the various categories
remained very close to this question's breakdown in 2006 and 2007. That breakdown clearly
favors larger organizations, at least compared to the U.S. economy as a whole, where there
is a preponderance of small businesses.