The Human and Organisational Issues associated with network security


JISC Committee for Awareness, Liaison and Training (JCALT)



By Andrew Cox, James Currall & Sarah Connolly
April 2001

Lead Institution: South Bank University, 103 Borough Road, London SE1 0AA
Collaborating Institution: University of Glasgow, University Avenue, Glasgow G12 8QQ

















“I would never if I could help it, put anything personal, private, or financial to me, anywhere near a
computer. I do not have confidence in computer security at all.”


“I don’t feel I have ever been told ANYTHING about IT security here! Maybe I don't use the computer
enough!”


“Too much emphasis is placed on IT support when users should shoulder the blame for most security
breaches, virus outbreaks and general non-functionality of IT equipment.”


“I am shocked at how easy it is in some cases to break IT security. There are websites to help and provide
downloads for breaking in. The potential for security breaches does increase my awareness for fair dealing,
oddly enough. IT security can be used as an excuse to hide material that ought to be disclosed. Can mean
against open communication!”


“Important subject - but I don't know much about it!”



“I can't see what can be done to stop it [plagiarism] going on.”
“An evil that should be strenuously combated, difficult as it is to eradicate it. Copying and cheating are
dishonest and dishonourable.”

[Comments collected in the project questionnaire]
Table of Contents
The Human and Organisational Issues associated with network security
Table of Contents
Acknowledgements
1 Executive summary
2.1 Introduction
2.1.1 Information security
2.1.2 Plagiarism
2.2 Methods
2.2.1 Design of questionnaire
2.2.2 Interviews
2.3 Summary of questionnaire results
2.3.1 Extended commentary on results - question by question
2.3.2 Conclusions
2.4 Interview results
2.4.1 Extended commentary on interviews
3.1 Conclusions
3.2 Creating a security culture
3.2.1 Obstacles to creating a security culture
3.2.2 Obstacles to training users in a security culture in HE/FE
3.2.3 Creating a security culture in practice
4.0 Recommendations
Appendixes


Acknowledgements

The authors of the report would like to thank everyone who has contributed to the project: those who took
time to fill in the questionnaire, those who offered themselves for interview, and those
who participated in the two project workshops and the conference.

Particular thanks are owed to those who distributed the questionnaire at their institutions: Aenea Reid
(University of Aberdeen), John Henderson (University of St. Andrews), Gordon Hunt (Royal Scottish Academy of
Music & Drama), Stan Houghton and Sara Eyre (University of Bradford), Richard Eade (Nottingham Trent
University) and Roy Adams (De Montfort University).

We would also like to thank the Steering Committee: John Akeroyd, Anthony Amatrudo, Andrew Cormack,
Matthew Dovey, Thomas Lancaster and Mike Tatlow, for their support and guidance.

A particular debt is owed to Andrew Cormack and to members of JCAS, particularly Alan Robiette, who
contributed so many ideas through meetings, personal contacts and email.
1 Executive summary
In the second half of 2000 the JISC Committee for Awareness, Liaison and Training
(JCALT) commissioned a study of the human and organisational issues associated with
network security by South Bank University and the University of Glasgow.

Higher Education and Further Education's increasing reliance on computers and
computer networks across a wide range of their activities means that digital security is of
increasing importance. Security can never be completely watertight, and technical
solutions do not offer the whole answer to protecting the institution or the individual from
a breach. Users' attitudes and behaviour are also critical, but little research has been
conducted into them.

The purpose of the study was to examine user behaviour and attitudes to computers and
computer security with a view to discovering how far these were congruent with good
and safe practice. The study looked at users' sense of responsibility and what they saw as
the greatest and most likely threats, their attitudes to viruses, policies, backups and
passwords. The study also encompassed some enquiry into attitudes to plagiarism, and
software piracy.

The study was conducted primarily through a questionnaire, piloted at the host
institutions and also applied to groups of users at six 'outer core' institutions. Some more
in-depth interviews were conducted. Two workshops and a conference were also held to
gather views, as well as being a means of disseminating initial results.

Variations were expected between different institutions and different groups based on
their role (e.g. support staff, IT staff, students). But the questionnaire showed significant
minorities with surprising and often risky attitudes and behaviour. Respondents tended
to underestimate the effect of their actions on others. A few people do get a lot of viruses,
but there was evidence of quite widespread bad practice, and that users do not fully
understand the institutional cost of viruses. Many people seem not to read policies, and
on the whole policies seem to be viewed quite negatively. Users are confused about
backups, recognising their importance but not consistently making them. Risky password
practices are alarmingly common. Impersonation is underestimated as a risk. Users do not
fully recognise the risks associated with sending confidential information by email.
Although ignorance and uncertainty were quite widespread, users did seem motivated
towards security.

The report reviews some of the possible methods for raising security awareness. In
particular, awareness-raising training sessions and a list of 'ten personal action points' are
presented in some detail.

The study found that users do not understand security very well. We recommend that
awareness among senior managers, IT managers and users needs to be raised. This could
in part be accomplished through a senior management briefing paper, and through
conferences and good practice guides for IT managers. The report recommends further research in the
area of security awareness methods, attitudes to plagiarism and software piracy.


2.1 Introduction
2.1.1 Information security
2.1.1.1 Information security: a growing problem
As in society as a whole, in Higher and Further Education (HE/FE) the network is being used for more and
more critical, confidential and private activities, such as to:
• spend departmental budget;
• communicate with colleagues about plans;
• circulate minutes of confidential meetings;
• receive CVs from prospective employees by email;
• track students' progress through the Web;
• access information resources licensed by the library.
This growing reliance on computers leads to a corresponding increase in the importance of security.

A major ambition in HE/FE is to make more use of the network in core learning and teaching activities: to
manage the whole learning experience from payment and registration to teaching and final assessment
online. There are many points of security risk both to the institution and to the individual learner in online
learning [i]. Impersonation may be the biggest threat for both. All this points to the growing importance of
information security to HE/FE, as to many other institutions.

The main information security risks for institutions are:
• loss of data (corrupted or lost);
• loss of work time (e.g. due to downtime);
• loss of IPR (theft of course material - for use by non registered students or rival courses);
• loss of reputation, or as Cormack puts it 'trust' [ii]. Individually and collectively British Universities need
to protect their image in an increasingly global education market.

Measuring this risk is quite difficult in HE/FE; there is no simple financial calculation that can be made, as
there is for a bank, of loss of potential earnings, say. More work needs to be done on how to measure risk in
HE/FE, for without clearly understanding risk it cannot be managed rationally.

This said, policy makers must be reasonable about what level of security is wanted online. There does seem
to be a systematic tendency in society to strive for watertight computer security, when insecure 'real' world
systems are seen as satisfactory. Credit card sales by phone or cash back are not very secure. In fact
examination of many online risks shows that they are similar to real world risks. The main difference online
is one of scale rather than of kind. A student could copy learning materials for people who haven't paid for a
course simply by using a photocopier. It is just that this sort of piracy would be easier and quicker with
electronic information, and could be done on a much greater scale. So greater precautions are needed in
relation to electronic information; but strong security is costly in terms of computing power and usability,
and absolute security is probably impossible.

It is important to remember that digital information and network security is part of a broader picture.
Information is only as secure as the weakest link in its protection. A secure network and network practice will
not provide protection for printed copies of files in unlocked offices or paper copies of emails sent through
the internal mail.

Not all activities need equal protection: much of basic communication by email is not mission critical or
particularly private, and so the security requirements are not very high. Strong authentication as a global
mechanism for all authentication would therefore be overkill in most cases, even for mildly confidential
material. It is more effective to find appropriate technology to fix particular security problems or reduce
certain risks, taking into account the specific needs of the process, than to seek global solutions.
Implementation will be easier too. Thus if an institution wants to distribute exam results securely by email,
appropriate tools should be found for the parties active in the process, and training given, rather than trying
to implement the system across the entire University [iii].
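
To make the last point concrete, a minimal sketch follows of what such a targeted tool might look like,
assuming Python and the third-party 'cryptography' package (the library, the key-sharing arrangement and
the message text are illustrative assumptions, not taken from the study). A key is agreed out of band between
the parties active in the process, and only holders of that key can read the encrypted results:

    from cryptography.fernet import Fernet

    # Generate a key once and share it out of band with the examiners
    # (a hypothetical arrangement, for illustration only).
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Encrypt the sensitive text; the resulting token is safe to attach to an email.
    token = cipher.encrypt(b"Candidate 1234: mark 68 (upper second)")

    # Only holders of the key can recover the plaintext.
    print(cipher.decrypt(token).decode())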

2.1.1.2 A critical change: mass computer ownership
In the past the security problem in Universities has been the need to protect campus based services, in
basically a closed network, from outside threats. Often it has not seemed necessary to ask much commitment
or understanding of users, except in the area of passwords.

But as HE/FE try to deliver more courses to users regardless of whether they are on campus, at another
workplace or at home, the challenges will change. It may no longer be possible to achieve security without
some understanding, motivation and responsibility among users. In fact, as computer owners themselves,
with an increasingly varied experience of using computers outside their institution of study, users
may increasingly see themselves as vulnerable and under threat, and have an enhanced appreciation of
security. This is the most fundamental shift in the environment for computing service departments in the sector.
2.1.1.3 Technical solutions
There are promising technologies that offer solutions to the problem of security on open networks:
cryptography, digital certificates, biometrics for example. These solutions are however expensive, and
certificates for example are not a mature technology. There is also a danger that implementations will
ignore the realities of how users and administrators behave. However secure the technology is, if users and
administrators do not fully understand the system, or are motivated to get round it, they will probably find
ways to do so.
Technology must acknowledge work practices. A good example is that computer users do share passwords;
they often need to, as part of joint working and shared responsibility. So for policies on passwords always to
insist on not sharing will lead to a perception that the rules in general don’t make sense, and cannot
be observed. Again, Sasse points to the folly of forcing users to change passwords at awkward times or
without sufficient time to think out a good password [iv]. Implementations of technology should have
reasonable expectations about what the user can and will do.

It is commonly argued that security should be 'transparent to the user', hidden away or built into the
environment. The expectation is that users don’t need much appreciation of the issues. It is often assumed
they are simply not interested in or concerned with security, and that information can be given out on a need
to know basis. This is a caricature of course, and there is a real case for releasing a minimum of information
about many security related incidents. Publicising attacks on the University system by hackers is unlikely to
help the situation: it may only alarm users, supply knowledge and satisfaction to the hacker, and possibly
inadvertently reveal vulnerabilities. But users will need a greater appreciation of security in the future, as
we work more online, on open networks, with sensitive information. The assumption that users are not
really motivated about security may well be false.
2.1.1.4 'Transparent to the user' security is not possible

There are many reasons why it is unlikely that HE/FE will be able to offer a watertight 'transparent to user'
security solution in the near future:
♦ Commercial off the shelf solutions are expensive;
♦ There are a large number of legacy systems that must be made to interoperate;
♦ Low retention of skilled computing staff, discouraging investment in their knowledge of systems;

♦ Weaknesses in the technology itself, due either to shortcuts taken in the software design industry [v]
or to the failure of software designers to understand that security requires different design features from
the user friendly design seen as the main priority for most applications [vi].


There are two further special factors of importance in HE/FE that make delivering watertight, transparent
security unlikely:
♦ Universities have very large transitory populations of users. This makes basic authentication and
authorisation problematic.
♦ The increasing spread of mass computer ownership, meaning that full time students and staff work
more and more away from campus, at home, at other workplaces etc.

In one sense transparent security is a logical contradiction, if security means 'not letting people do things I
don’t want them to do', since this implies choices about what you want to allow. Also, even very heavy security
cannot prevent one from simply sending an email to the wrong person by mistake, with possibly disastrous
consequences.

Users bring with them ever more assumptions and habits about how security should work; it is essential that
they be more knowledgeable, and be told more, so that they can adjust their behaviour.

While it is desirable that security solutions be as simple for the end user as possible, they will not always be
transparent. In fact there is a case for saying that 'system security needs to be visible' [vii]. An aware,
thoughtful user should be an ally. They may be able to report otherwise unnoticed events or 'near misses',
so averting more serious damage.

Hidden security has its dangers because the user is left ignorant, and gets into habits that are risky outside
the protected environment in which they have been operating. For example, if virus protection is completely
hidden, the user may believe viruses are never a problem and thus indulge in behaviours which would be very
unwise outside the protected environment.

In many cases users need to know about security. They need to know how to rate the reliability of email -
and to assess how secure it is. They need to understand what it actually means when one uses a 'digital
signature'. The conclusion is that institutions need to tell users about security, persuade them to change
their practices and accept some responsibility for the security of all. In short universities need to start
creating a security culture.
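
As a concrete illustration of the last point, the following is a minimal sketch of what 'using a digital
signature' actually involves, again assuming Python and the third-party 'cryptography' package (the library
choice and the message are illustrative assumptions, not part of the study). The sender signs a message with
a private key; anyone holding the matching public key can then verify that the message has not been altered:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()  # kept secret by the sender
    public_key = private_key.public_key()       # made available to recipients

    message = b"Coursework deadline extended to Friday."
    signature = private_key.sign(message)       # sent along with the message

    # A recipient verifies the signature; any change to the message makes this fail.
    try:
        public_key.verify(signature, message)
        print("Signature valid: the message is authentic and unaltered.")
    except InvalidSignature:
        print("Signature invalid: the message or signature has been tampered with.")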

2.1.1.5 What is known about users' attitudes
If users' active co-operation is essential to achieve appropriate security in HE/FE, there is a need to start by
trying to understand what users' attitudes are now. Amazingly, there seems to have been very little
investigation of this, though administrators do have the practical experiences of years of observing online
behaviour.

This lack of enquiry may reflect a belief that the human side of security can be fully encompassed by the
psychology of those malicious people who attack systems. Authors often say that the human problem in
security is not the hacker but the disaffected insider. But both these share a malice towards the institution. It
may be that more important problems are ordinary laziness, ignorance and incompetence - issues that need
to be addressed by changing the culture of the whole user body, not by seeking out a few malicious
individuals. In fact the biggest problems to users may be losses of data due to failure to back material up or
picking up a virus. It may also be that these seemingly individual, limited problems are of as great
importance to users as the big network-wide problems that computing services rightly prioritise. This is
another reason for users knowing more about security.


We need to know more about:

1. Students' attitudes to computers and security, and how these fit with the views of academic staff or
support staff. Differences in attitude could be points of failure in security. Cormack defines a security
culture as one where all parties share the same view of things. If this is so, large variations in
perception, expectation and attitude reflect a breakdown. Actually we would expect differences of role
to create both different security needs and perceptions. One would not expect the user and the
administrator to see security in the same light.
2. Differences of attitude by academic disciplines, gender, age.
3. Perceptions of how secure or insecure different activities (email for example) are.
4. Users' greatest fears: loss of data, loss of identity, loss of privacy - what is really valuable to them.
Which are the really important failures, from the users' perspective.
5. Actual behaviour about backups, viruses, passwords.
6. What rate as the greatest annoyances, and how these rate alongside 'real world' annoyances.
7. Who people trust to tell them about how to use their computer safely.
8. How much users realise that their own actions impact on others.
9. Perception of commitment of the institution to security.
10. Understanding of real but less obvious threats: e.g. impersonation.
2.1.2 Plagiarism
The second strand of the project was the examination of plagiarism. This is an issue related to computer
security in that:
• It is generally agreed that plagiarism has increased with the growth of electronic information sources,
especially the Internet, making it easy to copy and paste material into one's own work;
• It raises the same issues about how to promote awareness of policies.


The project set out to collect some preliminary information about attitudes to plagiarism, as the basis for
future study. One of the main aims of the questions was to try and detect differences of attitude between
students and academic staff, as these might have implications for how the problem is tackled. It was assumed
that IT and support staff would probably not have strong opinions. Those engaged in learning were likely to
have the strongest views.

2.2 Methods
The project team considered the following options for conducting the research:
1. Questionnaire. This was obviously likely to be the most effective mechanism for gathering a large
number of responses from multiple institutions. In the timescale of the project it was however
unrealistic to undertake a survey of a fully representative sample of the population.
2. Semi structured interview. It was felt important to supplement the questionnaire results with some
more qualitative, open ended explorations in face to face interviews.
3. Observation. Within the scope of this project it was unrealistic to attempt any sort of observation/
experiment with users. Although the sensitivities that security arouses, and the difficulty of reflecting
on one's actual behaviour in an interview, were barriers to deeper understanding of responses to a
questionnaire or interview, we did not feel that the results of this method would have made a
significant difference to the research results.
4. Reflection / Personal diary. One of the most effective research methods we used was reflection on
one's own behaviour, although we did not formalise this into keeping a personal diary, say.

The discussions of the Steering group on and off line and all the interactions within the workshops and
conferences were also a significant source of insights, and a check on the validity of preliminary findings.
2.2.1 Design of questionnaire
The project team agreed that the questionnaire should be:
1. applicable to all types of users (i.e. we did not want to have different questionnaires for different groups
of respondent);
2. short (to maximise participation);
3. clear;
4. reusable (not requiring rewording to be reused in another institution);
5. allowing opportunity for free comments.

We believe we achieved most of these goals, although to cover enough of the ground we were interested in
we did ask 41 questions, and many of these were multipart questions. For the text of the questionnaire as
used across the whole sample, see appendix 1. Typically it would take 20 minutes to complete.
The questionnaire was trialled at South Bank with a variety of staff; then responses were gathered across the
institution and also at Glasgow. Analysis of the initial responses suggested that the questionnaire asked
questions that did elicit a variety of responses, and that the respondents did understand the questions as they
were meant.

After analysis of the South Bank and Glasgow responses, colleagues at six other institutions distributed the
questionnaires; they were asked to choose a specific group to target, e.g. postgraduate students or IT staff.
In all, around 300 people completed the questionnaire.

2.2.1.1 The questions
The questionnaire aimed to collect four broad types of information.
1. Factual information about the respondents, for the purpose of cross tabulating attitudes with status,
gender, age and access to a computer at home. These issues were addressed by the questions in the
Background section.
2. Opinions and attitudes about security issues, under the headings of General, Viruses, Policies,
Backups, Passwords and User IDs, and Information.
3. Opinions and attitudes about plagiarism, under the heading Originality and Cheating.
4. Some factual information, such as the number of viruses respondents had suffered in the last year.

More specifically the main themes that the questionnaire sought to examine were:
1. Actual experience, such as how many times in a year one gets a virus.
2. Views on risk, what are the most likely and the most damaging threats, and who and what is at risk.
3. Sense of responsibility.

4. Sense of trust in institutional protection.
5. Awareness of policy.
6. Attitudes to security steps or policies the user is themselves required to observe (e.g. irritation,
acceptance, ignorance).
7. Claimed compliance with policies, best practice etc.
8. General attitudes to different types of security breach.
9. Sources of information relied on for support or instruction.

10. What are the most serious forms of cheating.
11. Why some students do cheat.

2.2.2 Interviews

Interviews were chosen as a complementary method to the questionnaire as they can examine a subject in a
more open-ended way. Each of the interviews was based on a series of open-ended questions (see appendix
2). The interviewees were interviewed in their workplace, where they felt comfortable, and were taped for
accuracy. They were aware they were being recorded, and were told that they would remain anonymous.

Although superior to a questionnaire in the depth of its conclusions, an interview does not
guarantee complete understanding of respondent behaviour; the interviewer is never sure whether they have
got to the real truth of what people do in practice; and they do not necessarily understand the behaviour that
is described to them. In this case the interview group was also too small to be the basis of any serious
generalisations.


2.3 Summary of questionnaire results
2.3.1 Extended commentary on results - question by question
In this summary we give first a commentary on the global response, followed by a look at variant responses
among particular groups of respondents. Two particular collections of respondents are referred to. Group N
were the 41 respondents from one particular institution, a mixed group of staff and students, with no
particular bias towards faculty. Group Z were the 58 respondents from another institution, chiefly a mixture
of humanities students and staff.

2.3.1.1. Background

Basic demographic data was collected about respondents, including status, faculty, sex and age. We
wanted to explore whether there are systematic differences between such groupings and attitudes to and
experience of network security. Of the 319 respondents, 22% were academic staff, 10% were IT staff, 12%
were other support staff, 26% were undergraduates, and 33% were postgraduates. Of the two thirds of
respondents who stated a departmental affiliation, about 45% were working in the humanities and 45% in
science, with a small group of social scientists. There were roughly equal numbers of male and
female respondents. About 20% of respondents were under 25, another 30% were between 25 and 35, a
similar proportion between 36 and 50, and 15% were over fifty.

The questionnaire then asked respondents about their basic attitudes to computers (Question 5).
Respondents were asked to state whether computers were a distraction, a necessary evil, useful tools or
essential to learning and teaching. Few gave any response other than that they were useful tools or essential.
Only 14 individuals saw them as a necessary evil, and 2 as a distraction.

When asked if they had computer access at home, exclusive or shared, 80% responded that they did, with
the majority (46% of all) having exclusive access (Question 6). We thought that the users' sense of
responsibility, and many of their attitudes, might be different if they regularly used a computer outside the
University. So the intention was to cross reference the response to this question to other answers.

2.3.1.2. General

In this section questions were asked about how the respondents evaluated the possible effect of their own
actions, and some general security-related issues.

First respondents were asked to rate how likely they thought it was for them to lose a day's work from a
virus, disk failure, hacking or human error (Question 7). Respondents were clear in seeing hacking as an
unlikely event (80%). Around two thirds of respondents thought disk failure and a virus unlikely. Human error
was seen as the least unlikely cause of a loss of work. This seems a reasonable assessment of the risk.

Perceptions of the impact of one's actions
Often people underestimate the extent to which their actions even as simple users can affect others,
especially through email. Question 8 sought to measure users' perceptions of the impact of their actions,
whether it was likely to affect themselves only, immediate colleagues, others in the university and so on.
The options as worded did not offer a simple diminishing scale of likelihood: for example, that one's actions
will affect "many people in universities" is probably slightly more likely than that they will affect most
people in one's own university.
Around half of respondents thought it likely that one's actions would affect only oneself. But a fifth
recognised that actions might affect others in the room and many people in UK universities. Few people
thought their actions likely to affect most people in their university. Group Z were unusual in opting
strongly for 'myself only', perhaps reflecting a prevailing attitude amongst students, who were predominant
in this group of respondents.


Irritations
Policy makers often make assumptions about what users find most irritating when using a computer: e.g.
multiplicity of usernames and passwords. When asked to choose the greatest security related irritation,
around a third of respondents did pick out too many IDs and passwords (Question 9). One fifth opted for
viruses. Few thought limits on what software is allowed to be installed were the main problem. The other
options (rules/regulations, lack of off campus access, lack of clear guidance on security) were each selected by
around one tenth of all respondents. Group Z were unusual in stressing other people's behaviour.
Strangely, the majority of those who thought other people's actions were the main problem also thought that
their own actions were likely or very likely to affect only themselves (Question 8).

2.3.1.3. Viruses

In this section respondents were asked questions on virus scanning software, the frequency of infections,
and who they perceived to be responsible for protection against viruses.

Virus checking software is sometimes seen as too difficult or time consuming to use. The first question (10)
asked for views of virus scanning software: was it too difficult or too time consuming to use, or not
worthwhile? For all the statements there was a distinct majority of respondents who disagreed with them.
90% of all respondents disagreed with the idea that scanning for viruses was unimportant. There was also
strong disagreement with the statement that scanning was too time consuming; this was rejected more
emphatically than the view that it was too difficult or forgotten.

We asked if the University took adequate steps to protect computers and files from infection by computer
viruses. We wondered whether users would feel that their institutions were doing enough to protect them,
and where they thought responsibility for protecting information lay. There was a mixed response: 40% said
yes, but about a quarter said no, with a large group of 'not sure' answers. Perhaps people are not sure what the
university is doing, reflecting the widespread practice of keeping security arrangements relatively secret.
Group Z were ambivalent. The majority of respondents from Group N said no, the institution was not doing
enough.

We don't really know how many people are affected by viruses, so Question 12 was an attempt to collect
some indicative data about the extent to which users are actually affected by them. 30% reported that they
had had none. 60% one or two. A very small number of people had more than this. This suggests a
relatively healthy state of affairs. A large number from Group N who had indicated in Question 11 that they
did not get enough protection had had a virus. This might simply reflect one major outbreak (e.g. the Love Bug).
Looking at role based groups, IT staff stood out as acknowledging having had one virus in the last year.
Undergraduates mostly claimed to have had none. Perhaps this is because the protection institutions are able to
offer to students is greater. Yet among the people who had had three or more viruses, students seemed to be over
represented. However the total number of individuals involved was very small.

Around 35% of all respondents did not routinely scan for viruses (Question 13). 20% only ran the
scanner sometimes. So a small majority are taking a risk, with only 45% claiming always to use a virus
scanner. There were distinct differences between institutions: the group from N mostly said they did not
always scan, while those from Z said they did. There is a feeling of contradiction between the answers to
question 13 and question 10, where respondents were emphatic in rejecting suggestions that virus scanning
is difficult to use.

The next Question (14) asked respondents' opinion on whose fault it would be if their files got infected with
a virus. This was another question about sense of responsibility. The least popular answer was the option
that it was their own fault for not understanding (only 20% of respondents selected this). There was a fairly
equal spread among the other answers. This suggests a degree of uncertainty about who is actually
responsible for protection against viruses. Again there were variations between institutions. Z said it would
be their fault. N tended to choose the two options that laid responsibility on the University.


We then offered a choice of statements about what were the most significant problems caused by viruses,
and asked respondents to indicate which they agreed with. The idea behind Question 15 was to find out
who people thought was most hurt by viruses, an aspect of their sense of responsibility and awareness of risk.
The options favoured were that viruses result in a lot of extra work for individuals; and that they involve a
lot of work for IT staff. Strangely, although respondents recognised that viruses caused extra work for staff,
there was little acknowledgement that this meant a cost to the University. A third of respondents saw
viruses as a minor irritation (this accorded fairly well with answers to earlier Questions 7 and 9).

2.3.1.4. Originality and Cheating

The respondents were asked to rate a group of different types of plagiarism and cheating with regard to the
degree of seriousness (Question 16). The idea behind this question was to gauge how different groups view
different offences, and show up differences in attitude especially between students and teaching staff, that
might underlie behaviour.
Three quarters of respondents thought helping someone to cheat, forgetting to cite a source for a quotation,
copying material from the web, copying a book without citing it and paraphrasing parts of a book and not
citing it were serious cheating. Submitting material for one assignment similar to something already
submitted, obtaining an essay and submitting it as your own work, doing an assignment with help from
friends not on the course and citing things one has not read were serious for about half of respondents. Only
30% thought help from people not on the course could be cheating. There is surprising variation here, with
significant minorities apparently holding unexpected views. More people seem to condemn copying from
the Web than condemn other copying as such, which is not very logical. As the least serious offences, Group
Z picked helping someone else to cheat, while group N picked submitting material for one assignment that
had already been submitted for another.

Causes of cheating
Respondents were then asked to choose from a set of statements about why some students copy others' work
(Question 17). Students' poor time management was the most selected answer, with about two thirds of
respondents choosing this option. Students needing more help was also a popular choice. About a third pointed
to work not being checked, the low risk of being caught, and the system of work and assessments encouraging it.
Few attributed cheating to how difficult the courses were, or to deadlines or light punishments. Again the
range of variation is quite surprising.
Undergraduates stood out as the only group seeing the course being too hard as a factor in cheating, and in not
seeing low supervision as the problem. It was staff who blamed students' poor time management. Staff
and postgraduates said that a cause is work not being checked, implying that they know things are not
checked, but undergraduates did not share this perception. Postgraduates and academic staff also opted for
the low risk of getting caught; undergraduate students and support staff did not see this as plausible. IT staff
more than others chose the option that the students don’t know that it is wrong. Staff generally voted for
the system of work and assessments encouraging it.

Policy
We asked respondents to say if the university had a policy on copying and cheating (Question 18).
Respondents were divided roughly 50:50 between those who thought the University did have a policy and
those who thought it did not, or were not sure. On the face of it this is very surprising, although if the
question had asked directly whether there was an anti plagiarism policy there might have been a more
definite response. Respondents from group N said no; those in Z mostly said yes.


2.3.1.5 Policies
Awareness of policies
50% of respondents said they had not read the University policy, with a substantial number of don’t knows
(Question 18). Don’t know seems a slightly surprising answer - one has either done a thing or not! On the
whole IT staff and academic staff claimed to have read the policy. Between a half and two thirds of the
other groups 'admitted' they had not.

The respondents were asked if they knew where to find the Academic Network (JANET) 'Acceptable Use
Policy' (Question 19). 50% of respondents said yes, with a substantial number of don’t knows. Fewer
people said yes in group Z, which was surprising, as in other questions this group seemed better informed and
compliant with good practice.

Only a third of respondents acknowledged having had any documents relating to computer security, and a
similar number any IT training (Question 22). It is surprising, given that most computing departments do issue
documents, hold courses and send round newsletters, that so few of the respondents were prepared to
acknowledge that they had learnt anything from them. There were local differences. At N training was seen
as the main source of information. At Z more people said they had received documentation.

Purpose of policies
Question 23 asked for opinions on the primary purpose of University IT policies. Around 40% saw the
purpose of policies as being to make life easier for support staff. The other options got support from 20% or
less of respondents. There was quite a lot of support for the fourth option, a rather negative view of policies.
N stressed the fifth option, that policies are just there to make life easier for support staff. Z saw them more
positively as protecting the individual.

A third saw policies as a source of guidance (Question 24). But a third also saw them as making computer
use more difficult than was necessary. 80% of respondents at N thought that policies simply make life more
difficult.


2.3.1.6. Backups

A very high proportion (90%) of respondents thought backups were important (Question 25), and the idea
that it was too time consuming or too difficult to make backups was dismissed.

50% of respondents 'admitted' that they had lost data because it became corrupted or accidentally deleted
(Question 26). All groups tended to say they had lost a file. Support staff stood out with a smaller
proportion admitting they had (despite the fact that they have much more exposure to the risk, from using
computers for a long time and more intensively than students).

In response to Question 27, 20% of all respondents said they did not do any backups. 30% backed up
everything. A large number (20%) missed out this question. This tends to suggest that although people
realise the importance of backups, a lot of them do not actually take the time to go through with the
process, even after a loss.

Those who had not answered 'all' to Question 27 were asked why they did not back up everything.
Respondents were equally split between the options, but the low scores for each suggested either that there
was another explanation or people did not really have a rational explanation for apparently irrational
behaviour.

At least half of respondents would partly attribute the blame for a loss of data to themselves, for not
understanding about backups as opposed to not doing them (Question 29). A third were prepared to blame
the University for not explaining things. Group Z were clearer in blaming themselves; N were more split
blaming themselves and the institution.



2.3.1.7. Passwords and User IDs

In this section, the respondents were asked to respond to a group of statements concerning passwords.

70% of respondents saw passwords as important (Question 30). But only 56% disagreed with the
proposition that they forgot to change passwords. There was considerable ambivalence about whether it is
important to change passwords.
Nearly everyone answered these questions, suggesting that the issues were better understood than backups,
say.

A third of respondents said they only had one ID (Question 31): difficult to believe, given that the question
explicitly mentions information sources, the multiplication of passwords for which is a well known
problem. Possibly the question was misunderstood because of the use of the word ID rather than password:
people think of the ID as the log-in ID, while it is the multiplication of passwords that is seen as the problem.
On the other hand it might suggest that users already understand Sasse's distinction between key IDs and
the proliferating number of IDs required to use particular services [viii]. Undergraduates had more IDs on the
whole, perhaps because they use more diverse online services. IT staff had fewer.

The respondents were then asked a series of questions about using another's password, credit or bank cards,
email accounts, using the same password or writing it down (Question 32). In this question we were trying
to gauge how people regarded sharing passwords and other 'secure' information. Over half of respondents
admitted to sending email under another's account, writing down passwords and using the same password
for multiple services. Few admitted to using someone else's password to access an online resource. Only
20% had used someone else's credit card. Undergraduates stood out as having logged on using another's
account. Academic staff stood out as more commonly using another's credit card, perhaps a reflection of
age: they were probably using their partner's card. IT staff stood out on whether they had sent mail under
another's account. Nearly all had done this; of course they had had more 'exposure' to the risk, having been
using computers longer.

We then asked how respondents would describe their attitude towards using someone else's account
(Question 33). One third of respondents failed to answer the question: presumably either because they did
not understand it, did not want to answer or (less likely) because they wanted to choose two options and
were directed to choose one. Of those who did answer, 50% said it was a serious matter. Few thought it
often necessary.

Question 34 asked about how one would know that someone had got into one's account. Around one third
of respondents thought you might know through information in the log in screen. The other options got a
few votes. The low response rate suggests that people don’t know what to expect to find, or were not
systematically looking to check that no such violation had occurred.

After this question we asked if they had ever found out that their account had been accessed by someone
else (Question 35). Almost half of respondents did not answer, suggesting that they were not sure. Only 11%
of people said someone had got into their account.

Question 36, which was only to be answered by those who had had their account breached, asked what had
allowed someone to access their account. There was an equal spread between the options. But the total
number of responses was small.

Respondents were then asked to rate a set of problems that might arise as a result of someone breaking into
their computer or email (Question 37). All the options were seen as important problems. The order of
importance, with the most serious first, was: reading files, altering or deleting files, deleting messages,
changing passwords, reading emails, and sending email under your name. If one accepts our premise that
impersonation is the most dangerous threat in reality, then there is a gap of understanding among
respondents. Respondents saw a loss of data through the deletion of files, or the loss of integrity of files by
their being altered as slightly more worrying than impersonation.


2.3.1.8. Information

We asked respondents to rate the value of various sources of information on using computers, including
colleagues, books, the Internet, help sheets and IT courses (Question 38). Colleagues seemed to be slightly
preferred as sources of information, probably the expected (if not the 'correct') answer. Most of the other
sources were seen as important by around half of respondents. IT Helpdesks were the exception with a
large number of don’t knows. The phrasing of the question, 'to learn to use a computer' as opposed to solving
a problem, would naturally reduce references to helpdesks; but the sudden drop-out from answering was
slightly mysterious.

When they were asked if they felt they'd had sufficient training in computer security, of the 80% of
respondents who did answer, there was a roughly equal split between yes and no (Question 39). This points
to considerable self-doubt among users.

Finally, we asked respondents if they thought online security was adequate for online shopping, exchanging
confidential information by email, online submission of course work, and online assessment (Question 40).
40% of respondents thought online shopping, exchanging confidential information by email and online
submission of course work were secure. This suggests that people do not appreciate the limits of email
security or the Data Protection issues, but perhaps exaggerate the risks of online payment.
There seemed to be a good appreciation of the problems of online assessment. Group Z were quite trusting
in online security, probably overly so. Overall IT staff and, to a lesser extent, postgraduates stood out as
most trusting in the network. Other support staff were most cautious. Academic staff and IT staff stood out
as trusting the network for confidential information.
A note on Gender differences
There were some interesting differences in attitude between the genders, although these may reflect job role
and status as much as pure gender difference, for half the females in the sample were 'other support staff',
that is administrators, library staff and so forth. Also males had more, and more exclusive, access (exclusive
access: 52% of males against 38% of females; no access: 15% of males against 25% of females). Responses
did show that males were more confident about security (Question 40a); this partly reflects the predominance
of males in the postgraduate and IT staff groups. Females were happier about online shopping. More males
admitted losing files (Question 29). Fewer males thought they had had enough training on security. There was
no noticeable difference, however, on a key issue such as responsibility (questions 8 or 12).


2.3.2 Conclusions
There was a wide range of responses to the questions. If security culture were defined by complete
agreement amongst everyone at an institution about security, then it would be evident that there is not one.
However variations of views are to be expected, both within institutions because of the different
perspectives of people with different roles; and between institutions because of differences in the
hardware/software environment individuals operate in. Having said that, there were often significant
minorities who gave surprising answers, indicating ignorance or risky practices. But this was far from
indicating a large body of 'reckless users' as suggested by Sasse [ix]. Users themselves acknowledged their
own feelings of uncertainty and ignorance, but not a lack of motivation for security.

As regards responsibility, there was a sense that users did not always understand that the
responsibility lies at least partly with them to protect themselves. There was a tendency to underestimate
the effect of one's actions on others.

A small number of respondents do get a lot of viruses, but on the whole people had few, and seemed to be
positive about the importance of scanning, even if fewer actually do it. People were slow to acknowledge
the cost to the University of the virus problem. There were marked differences in attitude between
institutions.

Unsurprisingly, quite a few people do not read policies. Policies are seen quite negatively, as being there
primarily to protect IT staff. In some institutions there are distinctly negative views towards policies.

Users seem confused about backups: stressing their importance but not actually doing them consistently. A
large number of people acknowledged that they had lost an important file because of accidental deletion or
because it had become corrupted.

Risky password practices are quite common, especially writing down passwords and sharing them. Perhaps
this reflects practical realities of needing to remember multiple passwords, or shared responsibilities.

Impersonation is underestimated as a risk.

People may not fully understand the risks of sending confidential information by email; while they are
much more cautious about online shopping.

Some institutions were doing less well than others, judged by attitudes and reported problems. This
suggests that there needs to be greater sharing of best practice.

There are relatively few clear cut links between role and attitude - so blanket education would be ideal,
rather than focussing on particular role based groups.

As regards plagiarism and other forms of cheating, there were again quite a lot of people holding surprising
views. In several institutions there seemed to be little recognition that there was a plagiarism policy at all. It was
interesting that staff and postgraduates thought people cheated because of low supervision, whereas
undergraduates clearly did not perceive this to be the case, and more often attributed the problem to how
hard the courses are.


2.4 Interview results
A series of interviews were conducted at South Bank University as part of the project. Researchers Kevin
Barry and Sarah Connolly conducted these interviews with two groups: one a mixture of computing
personnel and academics, the other a group of support staff.
2.4.1 Extended commentary on interviews

2.4.1.1 Scope of the interviews
The scope of the interview was broadly similar to that of the questionnaire, covering views and attitudes
towards computers, including home use, users’ practices in relation to backups, viruses, passwords, online
shopping, and policies. An additional question on software piracy was added in the second set of
interviews. The list of questions used in the second set of interviews is presented as an appendix to this
report.
2.4.1.2 Target population and choice of interviewees
The persons interviewed in group one were computing personnel and some academics. The initial aim
was exploratory, and the interviews were conducted before the results from the questionnaire had been
received. In particular we hoped to discover some differences in attitude to computers between the two
groups that might explain some of the friction in the system. The sample was very small, and in fact the
academics interviewed did not have notably different attitudes to computers from the computing staff.
Group two were non-computing support staff, with a range of levels of computer usage. To a certain
extent these staff help to enforce security policies and advise users on the use of computers, but they are
not directly responsible for running the network or for administering policies. It is important that they are
reasonably well informed about security.
Both groups were very small, and the intention was only to supplement the results of the questionnaire.
2.4.1.3 Group one: system administrators/academics
The main results to come out of the interviews were:
• There were few systematic differences between the groups, e.g. in attitudes to computers, except that
the academics had strong views on plagiarism, whereas the computing personnel did not seem very
aware of the issue.
• All agreed that University policies were not read or understood.
• None of the interviewees had received specific security training, but they did not feel they needed more.
• All claimed to maintain good practices about keeping passwords secure etc.
2.4.1.4 Group two: support staff

Attitudes to computers
Our initial expectation was that this group would have fairly positive attitudes towards computers. But what
was found was surprisingly negative overall. Most considered computers only a means to an end, and/or
useful tools, with only one interviewee expressing a very positive attitude. This was in direct contrast to
the interviews with computer personnel, who saw computers as completely indispensable, understandably
so. Only one had a computer at home.

Virus checking
Every person interviewed was aware that a virus checker was running constantly on their work computer,
which is the case at the university concerned. Of the two who had a computer at home, only one had a
virus checker on that machine. When asked how much a virus would affect the work of the whole
university, three out of five thought the effects would be enormous or catastrophic, with the other two
saying that the University would not be affected very much, if at all. It seemed that these staff were
slightly confused as to how much the university would be affected by viruses. They were not aware of the
reality that computers are constantly under attack, rather than just being victims of an occasional problem.

Security of emails
When asked about sending confidential material over email, each interviewee said either that they
would not do so, or that they would take appropriate measures depending on the sensitivity of the
information being passed. Everyone had received unwanted emails, with only one interviewee saying they
had used email filters to try and stop them. Our initial hypothesis was that people would not be particularly
well informed about this, and were actually sending confidential information frequently; in fact this group
were quite responsible and well informed.

Policies
All the interviewees in group two were aware of the existence of an email and an Internet policy at the
University, and could give a general idea of their content. However, when asked if the policies were
read and understood by others, they were unanimous in thinking that not many people, staff or
students, read or understood them (the first set of interviewees had the same opinion). The high level
of awareness of policies exhibited by this group was surprising, especially given that they themselves
thought that few others had read or understood them.

Backups
All of the interviewees said that they backed up files in some way. Four backed up some things
themselves on floppy disk and left the server to back up the rest; three admitted that they did not back
up systematically themselves and mostly left the server to back up everything. One interviewee said
they backed up everything all the time because of a deep-seated fear of losing anything; another never
backed up anything, always leaving it to the server. Three interviewees had lost work through not
making a backup, including the person who never backed anything up herself. Perhaps the most
interesting point to come out of this question was the attitude of the person who backed up everything,
although nothing had ever happened to her in the past. Perhaps we should also conclude that, contrary
to the usual assumption, users are very concerned about security. Perhaps what is lacking more than
anything is clearer information on what to do.
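
As an aside on what clearer 'what to do' guidance might look like, the sketch below is a minimal
illustration in Python (chosen purely for illustration; the source and destination paths are invented)
of how simply a systematic, timestamped personal backup can be expressed:

    # A minimal sketch of a systematic personal backup using only the
    # Python standard library. The paths below are hypothetical examples.
    import shutil
    from datetime import datetime
    from pathlib import Path

    def backup(source: str, dest_root: str) -> Path:
        """Copy the whole source directory into a new timestamped folder."""
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        target = Path(dest_root) / f"backup-{stamp}"
        shutil.copytree(source, target)  # the timestamped name avoids collisions
        return target

    if __name__ == "__main__":
        print(backup("H:/my-documents", "F:/backups"))

Run regularly, for example from a scheduled task, something this simple would already be more
systematic than the behaviour most of the interviewees described.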

Passwords
Four interviewees never changed their passwords, with the fifth answering "not regularly". Three had
shared passwords in the past, and when asked if they had changed the password after sharing it, all
answered no. A few had comments to add about changing passwords, the main one being that changing
a password more often would lead to writing it down in order to remember it, which would defeat the
purpose of changing it. The previous set of interviewees showed signs of not changing passwords
either, although they quoted the party line that they had been mandated to change them regularly.
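
To make concrete what 'usual policy recommendations' imply, the following minimal sketch (Python;
the 90-day limit and the user records are invented for illustration, not drawn from any institution's
policy) shows the kind of password-ageing check such recommendations assume:

    # A hypothetical password-ageing check. MAX_AGE and the records of when
    # each password was last set are invented for illustration.
    from datetime import date, timedelta

    MAX_AGE = timedelta(days=90)

    last_changed = {"alice": date(2001, 1, 15), "bob": date(2000, 6, 1)}

    def needs_change(user: str, today: date) -> bool:
        """True if the user's password is older than the permitted maximum."""
        return today - last_changed[user] > MAX_AGE

    for user in last_changed:
        if needs_change(user, date(2001, 4, 1)):
            print(f"{user}: password change overdue")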

Software piracy
A question was added to the interview to measure awareness of software piracy. Two interviewees were
aware of it. When asked if software piracy was a problem within the university, three said they did not
think so, while two said it would be a problem if a) someone knew how to do it and b) there was wider
access to the CD-ROMs. Everyone realised that copying a music CD and copying a software CD are
equally copyright infringement, but there were a few interesting comments on this question. One
interviewee thought that software companies factored the costs of piracy into the product anyway. One
said they had copied music, even though they realised the copyright laws were no less strict. Another
said that she could see how people could justify copying software, the argument being that the
company would never get her money anyway: if an application cost £800 she would not buy it, but if it
was available to copy then she would have it. Either way, the company would not get her money,
regardless of piracy.

Plagiarism
Two interviewees were positive that plagiarism was a problem at the university, with three not being sure.
Most were of the opinion that it was not more of a problem for their university than any other university,
again with a couple unsure. This supported our initial hypothesis that this particular group would not be
very aware of plagiarism as an issue. The same result came out of the interviews with computing personnel.

2.4.1.6 Conclusions

The interview samples were quite small, and were only undertaken as a quick way to gather information
supporting the questionnaire results. But they did show up some interesting sets of attitudes that would
bear further exploration.

• The interviewees were concerned about security, and quite well informed in some areas.
There is a desire for security.

• While denying that anyone else read policies, they had read them themselves.

• There was a gap between knowledge of the theory of good practice and actual behaviour.
Password practices in particular were poor, at least when evaluated against usual policy
recommendations. Behaviour about backups seemed confused.

• Some attitudes to computers were more negative than those that showed up in the questionnaire.

• Plagiarism did not bulk large on support staff's agenda.

• The question on software piracy threw up some interesting contrasts in morality as applied
to ownership and what constitutes 'theft' in the electronic sphere, which would bear further
research.

3.1 Conclusions
The group of people who responded to the questionnaire was quite small. However, we are confident that it
gives a true flavour of attitudes among individuals in the real world, at a number of institutions, at one
instant in time. Given the rapid change in IT and the high 'churn rate' in the student body, any study,
however large and detailed, would rapidly age. For example, the findings might be quite different as
Windows 2000 becomes more widely implemented. The nature of the typical computing environment in
universities is changing.

There are significant groups of users holding surprising or risky views. Institutions need to address this.

Some areas of particular weakness of understanding appear to be:

• Passwords

• Backups

The findings of the study support the researchers' view that draconian security regimes are unnecessary
and, in fact, unlikely to be successful. Users are motivated to act securely. The main barriers are ignorance,
confusion and some conflict between the individual's interest in getting their work done in the most
effective way and the nature of the security precautions required. It is more effective to persuade users, to
offer them help, and to make it easier for them to act in a secure way: to offer users help without trying to
take over completely, to support them while propagating a feeling of responsibility. This may be a
significant cultural change for some institutions but is in line with the general trend towards customer
focus in user services. The next section discusses some initial ideas about the nature of a security culture
and how it might be achieved.

Sasse takes a more radical view than this, seeing users as often reckless, and as deeply ignorant and
confused about computers, but we do not find much support for this view.

Some areas requiring further research are:

• Practical research measuring the effectiveness of different methods of raising security
awareness, in a way that leads to a measurable fall in problems such as virus infections or loss
of data.

• Plagiarism. This study only touched the surface of attitudes to copying and cheating. Rather
than simply investing money in technical fixes for detecting plagiarism, more thought needs to
go into the attitudes and structures that underlie it, the extent to which the tasks set for
students encourage cheating, and how best to change them, if necessary.

• Research on attitudes to ownership and property as they relate to computer software and
underlie software piracy (and similar theft of IPR in academic contexts).

3.2 Creating a security culture
3.2.1 Obstacles to creating a security culture
A survey of current attitudes is a starting point for creating a new security culture.
But creating a security culture will not be easy, because computers are difficult to understand and explain.
In fact people are probably too ready to trust them, because they don't really understand them. Security
itself can also be difficult to understand.[x] So too is the concept of risk. Most individuals probably have
difficulty grasping how to balance the risk of a highly likely event which would be merely an
inconvenience against that of a very unlikely event which, if it did happen, would be catastrophic. Rare
events are often the most significant. Fortunately, individuals rarely sustain a major loss of data, so it is
difficult for them to maintain day-to-day habits and vigilance when their experience tells them that
problems rarely, if ever, occur.

It is probably true that people want security.[xi] They are aware of the limits of their knowledge. But users
also do not want to be hassled by security in practice; the threshold of irritation is low. For most users the
task in hand is not about security - security is only a necessary condition for achieving what the user
really wants to do. So motivation for security is mixed.

People behave irrationally. If something goes wrong they are overcautious for a while; then, because they
cannot sustain this effort, they drop their defences. This is a classic response to security failures and other
relatively rare events such as rail or air disasters: initial avoidance followed by a return to previous
behaviour.


3.2.2 Obstacles to training users in a security culture in HE/FE
There are special reasons why creating a security culture in HE/FE might be difficult:
1. There are competing priorities. There is a need for security, but there is also a need to make computers
easy to use, possibly more for some technophobic academics than for students. Universities value a
culture of openness; some risk is worth bearing to preserve this as an essential part of the culture of the
academy. More specifically, policy makers are often struggling to stimulate internal information sharing,
e.g. on intranets; to impose rigorous security at the same time may seem like a contradictory policy.
Where the risks are explained to people, there may be an acceptance of some inconvenience. And as
universities are relatively decentralised institutions, there is less scope to impose policy on the
organisation than in relatively hierarchical commercial organisations.

2. There may be a feeling in HE/FE that there is not really anything worth protecting. Whereas businesses
such as banks are acutely aware of the value of their assets, if only because they are measurable in
monetary terms, there is often a feeling in universities that the information in the system is not really
very valuable. The typical user may not really feel under threat.[xii]

3. The large transitory population of users is obviously far less controllable than the relatively small stable
populations of staff in commercial companies. It would be a major undertaking to deliver security
awareness training across the whole student body. Students would probably be reluctant to spare time for
such activity, unless obliged to.

3.2.3 Creating a security culture in practice

The argument of this report is that there is a need to get over the message that users are responsible for their
own actions - and can affect others' work; that they need to be alert; that they should understand whom to
contact on noticing a problem; and of course that they should understand the risks associated with specific
applications, e.g. email, and techniques such as encryption and when to use them. For their part,
institutions will take steps to make the environment less risky, but can only go so far.

3.2.3.1 Have a plan i.e. an information strategy
JISC made a considerable investment in promoting the idea of information strategies. Surprisingly, many
institutions still do not have one. The strategy needs to adapt over time.[xiii] JANET-CERT have published
a list of links to security policies.
3.2.3.2 Training
A security awareness raising programme has been attempted by at least one American institution, James
Madison University. James Madison have experimented with a number of mechanisms to increase
awareness and understanding of security issues. For example, when users wish to change their passwords
they must work their way through a mini tutorial which explains some fundamentals of security through a
series of scenarios. On the whole the response of users at James Madison to this experiment has been
quite favourable (http://raven.jmu.edu/~dixonlm/quiz/), but we can envisage that many academics and
students in our institutions might take a rather different view. The tutorial is of course built into a system,
so staff and students cannot avoid participating to a certain degree, and may have become used to it.
James Madison also have a large Web site devoted to the top 10 security issues
(http://www.jmu.edu/computing/info-security/general/about.shtml);
they acknowledge that users are not very likely to use this information.
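
We do not know how the James Madison system is implemented; the hypothetical sketch below (Python;
the questions and the pass criterion are invented) merely illustrates the general idea of gating a
password change behind a short tutorial quiz:

    # A hypothetical sketch of gating a password change behind a mini quiz.
    # The questions, answers and pass criterion are invented for illustration.
    QUIZ = [
        ("Should you share your password with a colleague? (y/n)", "n"),
        ("Is 'password1' a safe password? (y/n)", "n"),
    ]

    def passes_quiz() -> bool:
        """Ask each question and require every answer to be correct."""
        correct = sum(
            1 for question, answer in QUIZ
            if input(question + " ").strip().lower() == answer
        )
        return correct == len(QUIZ)

    def change_password() -> None:
        if not passes_quiz():
            print("Please review the security tutorial before changing your password.")
            return
        print("Proceeding to the password change form...")

    if __name__ == "__main__":
        change_password()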

3.2.3.3 Awareness raising sessions
As part of the project we experimented with a consciousness raising session with a selected group of
support staff. This was one in an existing series of lunchtime lectures offered to staff, on a voluntary
basis, as a CPD exercise. It lasted around an hour.

Structure of the session
A. A short introduction to the concerns of the project. We started by reviewing the literature of computer
security, distinguishing that which concentrates on hardware and software solutions from the literature
that stresses the role of the 'insider' as opposed to hackers, and from our own stress on the need for a
security culture among all users. We also discussed some key project issues and concepts, namely the
effect on users of having computers at home as well as at work, and the difficulties of the concepts of
trust and risk.
B. A series of headlines and extracts from newspapers were used as jumping-off points to stimulate
in-depth discussion of the issues.

Basic security practice.
"Watch out there is a hacker about". (The Daily Telegraph, January 8th 2001) In this article there is advice
on backups, passwords, securing information. We led a discussion on how well people felt they, colleagues
and students performed against these 'common sense' criteria. We tried to tease out non-obvious aspects of
the issues. For example one of the pieces of advice is not to ' send sensitive information by email, except
encrypted.' This is obviously not a straight forward piece of advice, to follow it one needs to determine
what is sensitive information, and ideally identify different levels of sensitivity, one needs a strategy to
evaluate sensitivity systematically, what policies were relevant; it also begs the question whether people
know how to encrypt information, and the different levels of encryption.
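
To show that the mechanics of encryption are simpler than the policy questions around it, here is a
minimal sketch using the third-party Python 'cryptography' package (the message text is invented, and
the key handling is deliberately naive - getting the key to the recipient securely is exactly the hard
part that the newspaper advice glosses over):

    # A minimal sketch of symmetric encryption before sending text by email,
    # using the third-party 'cryptography' package (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # must somehow be shared securely with the recipient
    f = Fernet(key)

    ciphertext = f.encrypt(b"Exam marks attached - confidential")  # send this
    plaintext = f.decrypt(ciphertext)  # the recipient reverses the step
    assert plaintext == b"Exam marks attached - confidential"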

Vulnerabilities.
"Thieves of the future will steal your identity". (Sunday Times, December 10
th
2000). The aim was to again
problematise perceptions of what is at risk online. What is at risk if someone breaks into your account? Is it
simply loss of data (they delete a file), corruption of data (they make small difficult to detect changes), loss
of privacy (they read your files) or loss of reputation/trust (they send messages under your identity). The
last risk is arguably the most serious, but the least obvious. We also discussed whether Universities can
measure risk to themselves, and what the University has at risk.


Enforcement of policies.
• "Policing online: three companies tell us how they deal with email and web issues" (Evening Standard,
January 16 2001). This article described the policies of three companies towards people who broke the
rules in some way, such as sending .exe to a colleague or a document with a virus to a business
contact. The issue to discuss here was: how strict should the organisation be, what are reasonable
punishments, why would this vary with different types of organisation or context?

Online shopping.
• "Shoppers hacked off over online dangers" (Daily Telegraph, Nov 9 2000)
• "Increasing consumer confidence in the internet and security of online shopping has uncovered a
burgeoning worldwide appetite for laver bread" (Daily Telegraph, September 21 2000)
"How best to shop securely online" (The Times, October 16 2000) To close off the discussion we raised the
perennial issue of the security of online shopping systems, perceptions of their usefulness, security and
reliability.

Success of the exercise
There was vigorous and well informed discussion on most topics, with a high level of participation, and
people clearly enjoyed the event. One interesting point to come out of this particular session was that we
should try to increase awareness that the University's virus scanning software is available free for all
staff to install on their computers at home.

This was only a one-off experimental event, and clearly to be effective it would need to be tied to more
direct instruction. But it seemed a simple, effective (if somewhat unscalable) way to increase
understanding of the issues. Realistically, larger institutions might find this solution impractical:
obligatory attendance at a course would probably not be popular, and the cost of running such an event
across many departments would be great.
3.2.3.4 10 personal action points
A simpler, arguably more cost-effective, way to promote security awareness might be to use tools such as
the '10 personal action points' document used at the University of Glasgow to increase awareness of
immediate security practice. The text of this example can be found at
http://www.gla.ac.uk/Compserv/Doc/Polocy/ITsecact.html
3.2.3.5 Style of policies
Typically, computing security policies are lists of commandments written in rather legal language. They
exist partly to protect the institution against malefactors. But the way they are presented makes it unlikely
that most users will ever read them; ironically, they may well have little legal validity as a result. If
users are to be encouraged to read and understand policies it may be necessary to review their wording.
3.2.3.6 Better non-technical explanations
More work needs to be done on explaining security-related issues in non-technical language. JANET-CERT
have published a number of easy-to-understand explanations of security technologies, including on
viruses and backups.
3.2.3.7 Honor codes
Another possible means of enforcing a culture of security is an 'honor code', i.e. establishing that if a
student does not blow the whistle on fellow students who break the rules, they are themselves equally
culpable.
3.2.3.8 The Health and Safety model
It may be that HE/FE can learn a lot about how to create a security culture from the model of creating
Health and Safety awareness.[xiv]



4.0 Recommendations
Online security is currently not well understood by users. It will become increasingly important that they
do understand it, and that they take on responsibility for the security of the information over which they
have custodianship and for the 'safety' of themselves and others. There is a clear role for the JISC in
promoting discussion and exchange of information about users' understanding of security, and about
established best practice in promoting it.

It is important that awareness is raised at all levels in institutions:

• Senior management need to understand the importance of a security culture in safeguarding
institutional intellectual property and business continuity, and in protecting the institution
from litigation resulting from breaches of confidentiality or failure to comply with relevant
legislation.

• IT managers, at all levels, need to understand both 'physical security' and that the
implications of a security breach stretch well beyond compromised machines, into matters
related to the information stored and processed.

• Users need to understand, more clearly than was apparent in our work, their own responsibility
for data and its security, and that their actions can seriously compromise the work of their
peers/colleagues and of the institution.

The most effective way to do this might be through:

• A senior management briefing paper.

• A conference, or a strand in a JISC conference, exploring different experiences and seeking to
create a permanent online forum to share best practice and stimulate understanding among
policy makers.

• A handbook of good practice guides for IT managers, distilling best practice across the
sector, both in the UK and abroad.

JISC should consider further research in this field:

• Practical research measuring the effectiveness of different methods of raising security
awareness, in a way that leads to a measurable fall in problems such as virus infections or loss
of data.

• Attitudes to copying and cheating, of which this study only touched the surface. Rather than
simply investing money in technical fixes for detecting plagiarism, more thought needs to go
into the attitudes, practices and structures that underlie it, and how best to change them.

• Attitudes to ownership and property as they relate to computer software and underlie software
piracy, and similar thefts of IPR in academic contexts, including online information. These
attitudes may be broadly associated with plagiarism.

Appendices
Appendix 1: Full text of questionnaire
http://litc.sbu.ac.uk/jcalt/qu.html
Appendix 2: Interview questionnaire
http://litc.sbu.ac.uk/jcalt/qui.html
Appendix 3: Workshop reports
Notes on the Security in Online Learning Environments Workshop, 23rd October 2000, South Bank University
http://litc.sbu.ac.uk/jcalt/workshop2/workshop.htm
Notes on the New Security Technologies: Human and Organisational Aspects Workshop, South Bank
University, 21st August 2000
http://litc.sbu.ac.uk/jcalt/workshop1/
Appendix 4: Conference report
Creating a Security Culture in HE and FE, 24th November, University of Glasgow
http://litc.sbu.ac.uk/jcalt/conference.html
Appendix 5: Bibliography and links
http://litc.sbu.ac.uk/jcalt/links.html




[i] S.M. Furnell. "A Security Framework for Online Distance Learning and Training", Internet Research,
vol. 8, no. 3, pp. 236-242.
[ii] Andrew Cormack. "Why security culture matters", presentation at the Creating a Security Culture in
HE and FE conference, 24th November, Glasgow. http://www.ja.net/CERT/Cormack/Culture.pdf
[iii] Andrew Cormack, op. cit.
[iv] Anne Adams and Martina Angela Sasse. "Users Are Not The Enemy: why users compromise security
mechanisms and how to take remedial measures", Communications of the ACM, 42(12), pp. 40-46,
December 1999.
[v] Paul Taylor. Hackers. 1999.
[vi] Alma Whitten and J.D. Tygar. "Why Johnny Can't Encrypt: a usability evaluation of PGP 5.0",
Proceedings of the 8th USENIX Security Symposium, August 1999.
http://www.cs.cmu.edu/~alma/johnny.pdf
[vii] Anne Adams and Martina Angela Sasse, op. cit.
[viii] Martina Angela Sasse. "Users are not the enemy", presentation given to the Creating a Security
Culture conference, Glasgow, November 2000.
[ix] Martina Angela Sasse, op. cit.
[x] Andrew Cormack, op. cit.
[xi] Anne Adams and Martina Angela Sasse, op. cit.
[xii] Anne Adams and Martina Angela Sasse, op. cit.
[xiii] Alan Robiette. Developing an information security policy.
http://www.jisc.ac.uk/pub01/security_policy.html
[xiv] Information Security Forum. Information Security Culture - a preliminary investigation. 2001.