
Transcript of PROFESSIONAL COMPUTING

Page 1: PROFESSIONAL COMPUTING

PROFESSIONAL COMPUTING

THE MAGAZINE OF THE AUSTRALIAN COMPUTER SOCIETY, APRIL 1991

THE POWER TO PROTECT

[Cover: a Clean Line Systems uninterruptible power supply, its front panel showing 'inverter', 'reserve', 'help', 'step up', 'start' and 'stop' controls.]

Page 2: PROFESSIONAL COMPUTING

EDP Auditors Association, Region 8 Conference 1991. National Convention Centre, Canberra, 15-17 May 1991

EDPAC 91

Objectives
★ to promote the discipline of Information Systems Audit as a profession
★ to share practical, technical and management knowledge about auditing information systems
★ to address the differing interests of auditors, management, computer security and the data processing community

Who should attend?
★ Information technology managers
★ Data security officers
★ Auditors and accountants
★ Systems analysts and programmers

Topics include:
★ Data security
★ Computer abuse
★ Automated controls
★ Unix & open systems

Further information: EDPAC 91 Conference Secretariat, Conference Solutions, PO Box 135, CURTIN ACT 2607

ph (06) 285 3000, fax (06) 285 3001

Major Sponsors

Page 3: PROFESSIONAL COMPUTING

Have you noticed how small it is?

YOUR magazine relies on only two sources of revenue, an ACS monthly payment and advertising sales. Like all of the IT industry papers we are much smaller than usual, and smaller than we would like, because these sales are very hard to make at this time.

You may like to read the covering letter I am sending out with some promotional literature.

Dear (agency person),

Information Technology is rarely the reason for the existence of a business, and IT management can have a particularly hard time bidding for a share of corporate funds unless it has established a good track record of efficient contribution to the company’s success.

When times are easy, buying decisions can be made with a fair degree of confidence that the results of wrong choices will dissipate; if a really cynical view is taken, the consequences can be escaped by moving to a new job — there have always been plenty of them.

Today it’s different: IT managers’ decisions need thoughtful evaluation and serious consideration of options that have become technically difficult to understand, and demand more than ever the involvement of specialists within their departments.

Professional Computing magazine is read by the 12,000 members of the Australian Computer Society, decision makers and influencers who belong to the organisation because they recognise the importance of keeping up to date with fast moving technologies, of industry interaction to discuss and influence developments, and of a professional approach to important work. There is no formal requirement for people to join the ACS, although there are good reasons for certification of professional competence to operate in the industry; its members are people prepared to pay a non-trivial annual fee to belong to an organisation whose aims are all related to doing the IT job better.

Confidence that value will be received for an investment is one of the most effective answers to the problems of budgetary constraint, and our readers are the professionals equipped to supply that confidence. It surely follows that Professional Computing is the magazine most likely to ensure value from an advertising investment.

Yours sincerely,

A.H. Blackmore, Editor.

Do you believe this? Can you influence a decision? Help us to convince the advertisers that my message is true and you’ll get value in the form of a bigger, better magazine.

Tony Blackmore

PROFESSIONAL COMPUTING

CONTENTS: APRIL 1991

SERIOUSLY, SECURELY, COMPUTING: Data communications and computing security solutions have been driven by Defence needs. In this two-part article the issues are overviewed and the implications for Australian Government and private industry are drawn. 2

IT SYSTEMS SECURITY: A BROAD VIEW: In the context of Information Technology, general use of the term ‘security’ is vaguely defined. Everyday interpretations include maintaining privacy, the prevention of data loss or the prevention of data corruption. Regarding security as the protection of the information assets of an organisation is a definition which includes the everyday usages. 6

GROWING PERSONALLY AND LESS SECURE: The replacement of the once ubiquitous ‘dumb terminal’ by the personal workstation has created new local security risks and provided better tools for the accidental or deliberate destruction of host information. 8

OPENING MOVES: Real-time Unix discussed. 10

REGULAR COLUMNS: ACS in View 13


COVER: Clean Line Systems is a division of Chloride Power Electronics, a multi-national that is one of the world’s top three manufacturers of power support products for computers and critical electronic equipment.

Clean Line supplies and services sophisticated uninterruptible power supplies sourced from its manufacturing sites around the world, as well as Australian manufactured power line conditioners.

The Clean Line product range maintains electrical supply integrity for computers from personal to office through to large mainframes.

Having met the needs of the computer industry for the last 10 years, Clean Line continues to offer the most comprehensive power product range and service support available throughout Australia and New Zealand.


Page 4: PROFESSIONAL COMPUTING

Seriously, securely, computing

Data communications and computing security solutions have been driven by Defence needs. In this two-part article the issues are overviewed and the implications for Australian Government and private industry are drawn.

A.G. Kerr

WESTERN society is now well into the information age, with individuals and organisations increasingly relying on Information Technology (IT) systems. This dependence is often invisible, sitting behind the preparation of a TELECOM invoice, aiding the preparation of a chemist’s prescription or assisting the management of air traffic around congested airports. IT systems are becoming more interconnected, leading to exponential growth in computer-to-computer exchanges of information. At one end of a spectrum, IT system failures have the capacity to embarrass an individual and, at the other end, cause large organisations to cease operations. Failures in safety-critical software have led to loss of life.

Effective control of information in many organisations has changed. Previously managed by the users of the now-replaced manual systems, control is now in the hands of IT experts who often identify more with the technology than the aims of their organisation, and who rely on vendors for advice on the strengths of a system’s security features. Weaknesses will not be so identified.

The defence environment has generally been more advanced than other areas of government and private industry in demanding adequate security, within a limited scope. The US and, more recently, European and Australian Government defence organisations have been determining security requirements that vendors and builders of systems must meet to be able to sell to those markets. Many of these requirements are applicable to non-defence sectors.

This paper explores the attributes of secure IT systems and maps them onto evolving standards and requirements emanating particularly from defence arenas.

Technology advances and security

One of the earliest computer networking experiments occurred 50 years ago when G. Stibitz from AT&T Bell Laboratories installed a computer constructed of telephone relays that could be instructed to perform a complex calculation by any one of three Teletype terminals located on various floors of the Bell building (1). Today a personal computer connected via Local Area Network (LAN) or modem to interconnected Wide Area Networks (WAN) allows the rapid world-wide transmission of information. Much of this information would be textual rather than numeric, and much of it could be classed as sensitive. The security of an IT system can no longer be addressed solely within the physical boundary of an organisation.

It is necessary to reflect on where we have come from, and the rapidity with which the technology is moving beyond where we are now: security contemplated for today’s systems may be hopelessly inadequate in a few years’ time.

Some stages in the growth of importance of security include:

a. early long distance communication involved bonfires and smoke signals from local high spots. Weather conditions that allowed availability of the communication medium also removed any semblance of confidentiality;

b. with the arrival of the electronic and computing era, the mass movement of data onto computers, during the 1960s, was via desks of Hollerith encoded cards that were manually ferried between the user and the central computing installation. With appropriate manual procedures, and physical computer centre protection, confidentiality of information was reasonably assured. System availability was always a problem, and integrity of software and data often suffered as card decks were dropped and shuffled;

c. during the mid to late 1960s simple terminal star networks were placed throughout organisations, using vendor specific communication facilities, often with limits on cable lengths. This was the era of centralised mainframes and star networks with very localised and identifiable network boundaries. Information security was not a concern, primarily because the bulk of the information belonged to the then traditional number crunching area, and was retained within the walls of the organisation;

d. the 1970s saw the arrival of long distance computer controlled communications, and the movement of text information onto computer data bases. Networks used proprietary protocols, with inter-working between the offerings of different vendors being very complex. The boundaries of networks were still in the control of the network managers. The arrival of the local area network resolved many of the availability issues within an office organisation, allowing ease of movement of terminals and, through protocol translation, the movement of information between equipment from different vendors;

e. the 1980s was the decade of development and implementation of communications standards. Primarily driven by the International Standards Organisation’s Open System Interconnection (OSI) model, with major contributions from the IEEE, CCITT and other bodies, it became relatively easy to connect together numerous heterogeneous networks including LANs, WANs and Metropolitan Area Networks (MAN) on a variety of media and network topologies. Much work remains to be done in this arena, particularly within the higher levels of the OSI model;

f. the 1980s also saw the arrival of the personal computer and its connection as an intelligent workstation to remote mainframe hosts and local departmental computing nodes;

Page 5: PROFESSIONAL COMPUTING

A personal computer can now provide more computer power and information storage on an individual’s desk than existed in centralised mainframes in the 1960s. The PC phenomenon has revolutionised the availability of computer power. It has been disastrous for the integrity and confidentiality of information;

g. already in these early years of the 1990s, individuals can have on their desks massive stores of information and easy access to other systems. Hardware, software and communications advances are removing the last technological barriers to information availability; and

h. the coming years will see implementations complying with standards at higher levels of the OSI protocol stack, and a continual expansion in network bandwidths fuelled by ISDN, FDDI and other emerging technologies. The mass movement of information between locations will continue to become easier, faster and, perhaps, less controllable.

Until very recently, technological progress has single-mindedly increased computer power and connectivity, thus dramatically enhancing the capacity to share information. Within many organisations, copying corporate information onto floppy disc is easier than copying a few pages of paper. With increasing OSI-led connectivity, and the ease of adding modems to personal computers, network managers are faced with the almost intractable problem of managing unbounded networks.

Information that was once in filing cabinets and under the custodial eyes of co-located owners of the information is now computer managed in a way that is analogous to leaving the filing cabinet open at a busy highway intersection, with the added attributes that changes made to the information, or copies taken, often cannot be detected. With voice communications, the party line has long since disappeared; with data communications, the LAN is a party line on which a user with the right knowledge may listen to all passing traffic.

Information security

(i) Generic Aspects

A secure or trusted IT system is one in which:

a. confidentiality is maintained by the prevention of unauthorised disclosure of information; AND

b. integrity is maintained by the prevention of unauthorised amendment or deletion of information; AND

c. availability is maintained by prevention of the unauthorised withholding of information or resources.

Attainment of appropriate Information Security (INFOSEC) requires activities in a number of specialised areas:

a. computer security (COMPUSEC);

b. communications security (COMSEC), which includes:
   i. transmission security (TRANSEC); and
   ii. cryptographic security;

c. compromising or emitted radiation security;

d. personnel security;

e. procedural security; and

f. physical security.

Substantial analysis is required to select appropriate measures as, for example, one that affords greater availability may reduce confidentiality. While the proliferation of personal computers has dramatically increased the availability of computer power, the confidentiality and integrity of PC-managed information is generally well below the level attained in well-managed mainframe environments.

(ii) Computer Security

While mature operational procedures exist for mainframe environments, rigorous attention to security issues has yet to evolve. In specialised areas, of which defence is one, confidentiality issues have received substantial attention, but in general there is a lack of engineering discipline in tackling security. This

Continued page 4

The Guardian Is In Town. Viruses Beware!

Some experts predict that computer viruses could infect as many as 160,000 machines by 1992.

What are these rogue programs? They are software programs designed to hide in your system and do things that you do not want to happen. They can do anything from simply printing a smart remark on your screen to completely destroying all of the data and programs on your system, as well as rendering your computer system inoperative.

The designers of modern disk operating systems have made available programming hooks on which specially designed software can interface to the operating system, adding additional features where required (TSRs). Unfortunately the disk operating system has no way to tell the difference between an honest and a dishonest TSR and must presume that all TSRs are for the system’s own good.

Unscrupulous programmers trying to prove their programming abilities to the world have unleashed a barrage of Trojan Horses, Logic Bombs, Worms and the like (Viruses) — Aids, Disk Killer, Friday 13th, Jerusalem-B and Stoned II to name a few.

To date there are many software virus detection programs which generally scan a suspect disk or diskette looking for a fingerprint (a known binary pattern that is unique to each logged and documented virus). This method has several disadvantages:

1) The computer system must be up and running with a current version of DOS which may or may not be infected. If the system is infected then intelligent TSR viruses may intercept the scanning or reporting process.

2) The viruses that will be detected must be known by the scanning software in advance, and for this to take place the virus must have existed, wreaked havoc, been studied, documented and implemented into fingerprint tables at some operator’s expense (usually many units become infected before identification).

3) Intelligent viruses have been known to change their fingerprint pattern when replicating themselves.

THE ALTERNATIVE! A hardware solution that does not rely on DOS or fingerprint tables — hence the VIRUS GUARDIAN is born. This hardware solution (to deceptive software trickery) is in total control of all your hard disk drives (HDDs) and floppy disk drives (FDDs) even before DOS is loaded, protecting the system against:

1) Infected DOS bootups from both HDDs and FDDs;

2) Non-DOS infected boot loaders from FDDs (games);

3) Illegal writes to the Boot Sector, Partition Table and Directories;

4) Illegal writes to files protected by the File Filter Extension Table set up by you;

5) Additional security by password access to enter the computer system;

6) On the fly keyboard password protection.

Recommended Retail Price $349

SOLE AUSTRALIAN DISTRIBUTOR

Questwill Enterprises, “Questwill”, Crooked Lane, North Richmond NSW 2754, AUSTRALIA. Ph (045) 71 1508, Fax (045) 71 1314


Page 6: PROFESSIONAL COMPUTING

Seriously, securely, computing

From page 3

problem is compounded in the PC environment, where the rapid spread of viruses indicates a lack of security awareness by users which is unfortunately coupled with poor appreciation of the extent of corporate information that is managed by PCs.

The increasing use of computers in safety critical areas is necessitating reviews of software development methodologies. This is leading to the introduction of formal techniques, in which mathematical modelling of software algorithms seeks to guarantee the integrity of system operations. However, no such discipline yet exists that is widely followed in the IT community.

As further corporate productivity gains are sought, IT systems are increasingly managing and communicating legally enforceable documents. The forerunner was Electronic Funds Transfer. A new entrant is Electronic Data Interchange, which seeks to substantially reduce the amount of paper that needs to flow between corporations to consummate various agreements. Thus integrity measures must provide non-repudiation facilities whereby a sender of an electronic document cannot deny having sent it, and a recipient cannot deny having received it. Where appropriate, confidentiality must be maintained, and systems must continue to provide the required level of availability.

Many organisations view system availability from the perspective of terminal response times, employing analysts expert at tuning systems. Little planning addresses long term non-availability of IT systems. Only a few organisations, such as banks and others typically with large on-line terminal networks, have realised their total dependence on the availability and survivability of their IT systems, and have therefore developed and tested contingency plans.

(iii) Communication Links

Communication links facilitate the interchange of information. While asynchronous Baudot communication is still in use in isolated areas, the move to OSI compliant communication protocol suites is proceeding apace. Resulting from excellent collaboration by the standards bodies, the OSI protocol stacks have substantially removed communication integrity concerns. Error free delivery of information, in the order in which it was sent and without missing or duplicated data, is the normal expectation.

[Figure 1: OSI Security Services. A table mapping each OSI security service (peer entity authentication, data origin authentication, access control, connection confidentiality, connectionless confidentiality, selective field confidentiality, traffic flow confidentiality, connection integrity with and without recovery, selective field connection integrity, connectionless integrity, selective field connectionless integrity, and non-repudiation of origin and of delivery) against the OSI layers (1 to 7) that could provide it. NOTE: Layer 7 services could also be provided by an application.]

Although a rearguard action, the development of standards for the OSI security services, as shown in figure 1, is progressing and will, when implemented, provide confidentiality, authentication and non-repudiation services.

As shown in figure 2, the mechanisms that will provide many of the OSI security services rely on encipherment techniques. Such techniques must provide assured protection, and the cipher transform processes must be achievable at acceptable cost. While the symmetric Data Encryption Standard (DES) provides encipherment of recognised strength and is readily implemented in VLSI chip technology with acceptable transform rates, its usage, controlled by the US National Security Agency (NSA), is limited to a small number of application areas. Non-symmetric key schemes, like the public/secret key facilities of the RSA algorithms, which are required to provide non-repudiation services and to allow the transmission of transaction keys, are compute intensive and not suitable for high speed transaction-based encipherment.

Successful wide-scale implementation of the OSI security services requires further encipherment research.
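To make the public/secret key idea concrete, the short C program below works through RSA’s arithmetic with deliberately tiny textbook numbers. It is an illustrative sketch only: real systems need large keys, padding and careful key management. Note how the same secret-key operation that recovers a wrapped transaction key also produces a signature anyone can verify with the public key, which is the basis of the non-repudiation services mentioned above.

/* Toy illustration of a public/secret key (RSA) scheme.
   Tiny textbook numbers, for exposition only. */
#include <stdio.h>
#include <stdint.h>

/* modular exponentiation: base^exp mod m */
static uint64_t powmod(uint64_t base, uint64_t exp, uint64_t m) {
    uint64_t r = 1;
    base %= m;
    while (exp > 0) {
        if (exp & 1) r = (r * base) % m;
        base = (base * base) % m;
        exp >>= 1;
    }
    return r;
}

int main(void) {
    /* p = 61, q = 53  =>  n = 3233, phi = 3120; e*d = 1 (mod 3120) */
    uint64_t n = 3233, e = 17, d = 2753;

    uint64_t session_key = 1234;            /* e.g. a DES transaction key */

    /* sender wraps the session key under the recipient's PUBLIC key */
    uint64_t wrapped = powmod(session_key, e, n);

    /* recipient recovers it with the SECRET key */
    uint64_t recovered = powmod(wrapped, d, n);

    /* non-repudiation: encipher with the SECRET key to sign;
       anyone can verify with the PUBLIC key */
    uint64_t sig = powmod(session_key, d, n);
    uint64_t verified = powmod(sig, e, n);

    printf("wrapped %llu, recovered %llu, signature %llu, verified %llu\n",
           (unsigned long long)wrapped, (unsigned long long)recovered,
           (unsigned long long)sig, (unsigned long long)verified);
    return 0;
}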

The OSI security architecture addresses the secure transport of information between layer 7 entities, but does not address the general security issues within a computing node. The computing node is the equivalent of the Chubb safe and needs to have similarly strong security facilities to manage sensitive and classified information.

(iv) Assurance

The security features of a system are not apparent during normal operations. That they are present and working is reflected in an absence of security failures; there are no objective measures of performance. It is the absence of accountable events that reflects their correct performance, whereas the correct performance of applications and systems software is tested each time it is used. Thus there are problems in assuring that the security features are performing as required. When acquiring systems from vendors, it may be hard to obtain information about system deficiencies; only the system’s strengths are documented and available for perusal during the acquisition process. Achieving a secure system requires:

a. the use of an information security risk analysis and management methodology to identify the security functionality that is required; and

b. a process that establishes a level of confidence in the correct functioning of the eventually implemented security measures.

(v) Information Security Within a Defence Environment

The Australian Department of Defence’s mature procedures for the manual handling of classified material have evolved over many decades. They are based on:

a. clear marking of each document containing sensitive information with a


Page 7: PROFESSIONAL COMPUTING

[Figure 2: OSI Services and Mechanisms. A table mapping each OSI security service to the mechanisms, encipherment among them, that could provide it.]

classification (top secret, secret, confidential, etc.);

b. vetting and then authorising staff for access up to one of the classification levels; and

c. only allowing access to information if there is a demonstrated need to know.

Many staff are cleared to access confidential material, while only a small percentage has authorised access to top secret information. Stringent procedures surround all accesses to, and the management of, classified material. For example, mandatory characteristics of storage containers are specified in the Department’s security manuals, and much classified material is individually numbered and audited.

Communications security has been well practised, particularly during and since the second world war. On-line encipherment facilities encrypt all information on a link, thus providing protection against traffic analysis. As a corollary, these facilities also prevent the use of packet switch networks such as TELECOM’s AUSPAC. Defence data communication networks essentially comprise point to point encrypted links connecting secure locations. Terminating equipment for links carrying more highly classified information must be installed to exacting technical standards to prevent unintended radiation of unencrypted information along power lines, earth leads and the like. For similar reasons, computer and other equipment may need to comply with standards specifying maximum permissible levels of electromagnetic radiation.

The need to maintain strict and auditable confidentiality of classified materials sits uncomfortably with the current computing and communications technology that promotes availability and sharing of information.

Conclusion

This paper has skimmed across many issues in presenting a view that, as advances in Information Technology increase the ease with which information may be manipulated and communicated, greater attention must be directed at security issues both within government and industry. A top level view of the OSI security architecture has been placed in context with the present and evolving security philosophies of various government agencies. It is argued that the competitive nature of international marketing makes it imperative for Australia to develop an IT security evaluation and certification facility.

References

1. “Computing has come a long way since Stibitz first demonstrated networking in 1940”, Pacific Computing Weekly, 28 September 1990.

2. “Department of Defense Trusted Computer System Evaluation Criteria”, US Department of Defense, DoD 5200.28-STD, December 1985.

3. “Trusted Network Interpretation”, National Computer Security Center, NCSC-TG-005, July 1987.

4. “UK Systems Security Confidence Levels”, UK Communications-Electronics Security Group, Government Communications Headquarters, CESG Computer Security Memorandum No. 3, February 1989.

5. “Evaluation Levels Manual”, UK Department of Trade and Industry Commercial Computer Security Centre, V22, Version 3.0.

6. “Security Functionality Manual”, UK Department of Trade and Industry Commercial Computer Security Centre, V21, Version 3.0.

Tony Kerr is the Principal Consultant with Brundish Pty Ltd, specialising in the strategic integration of information technology. Tel: 018 367 588

This paper was presented at the Communications 90 conference organised by the Institution of Engineers, Australia.


Page 8: PROFESSIONAL COMPUTING

IT systems security: a broad view

In the context of Information Technology, the term ‘security’ is vaguely defined. Everyday interpretations include maintaining privacy, the prevention of data loss or the prevention of data corruption. Regarding security as the protection of the information assets of an organisation is a definition which includes the everyday usages.

Alan Conrad

TO ADDRESS security properly, an organisation’s needs for confidentiality, integrity and availability with respect to its information assets must each be considered. In today’s business, IT systems, their software and the data they contain are an organisation’s lifeblood and must be protected in a planned and systematic manner.

Security, however, is not another system to be added to the list of debtors, payroll and inventory control. Nor is it a feature, plugged into the feature list for version 2.0 like a graphical user interface. Security needs to be part of the organisational culture, considered at every level of the IT business function.

Security at the organisational level

The maintenance of proper security requires the analysis and management of risk, with the effect of reducing security risks to acceptable levels. While the most common approach is to implement a disaster recovery plan, that plan is often implemented without a systematic assessment of the requirements that have given rise to it. The implication of this simplistic approach is that a poor match between the security requirement and the disaster recovery plan countermeasure will lead to higher than necessary costs and/or reduced effectiveness.

A preferred approach to security is to start from the top working down through the systems: first cataloguing the information assets in need of protection, second identifying risks and vulnerabilities, and last choosing the optimum countermeasures.

Such an approach, which can be tedious, is greatly aided by a methodology which allows the process to be carried out in a systematic manner.

Assets requiring protection (the databases, applications software, the hardware necessary to process the data and communicate, and even the accommodation) should be catalogued. Each asset can be assigned a value to the organisation, or a loss that will be incurred through a breach in security. Losses could be incurred through the information being divulged, or through the required computer system being down for an excessive time. The effect of downtime will be different on an accounts receivable system when compared to a retail point-of-sale system.

Having listed the assets, the threats and vulnerabilities should be identified and analysed with the assets to reveal the risks faced by the organisation. The benefit of examining risks in this way is that they can be directly and clearly connected to the organisation’s business functions. For example, the business effect of a communications failure between two branch offices becomes clearly visible.

The final step in the process is to choose a set of countermeasures that match the business risks. Some will be discarded because the cost outweighs the risk; others can be implemented with a clear vision of the protection of assets.

One such countermeasure might be the implementation of a disaster recovery plan. If it is, you will clearly understand its objectives and have a corresponding budget.
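As an illustration of the catalogue-assets, analyse-risks, choose-countermeasures cycle just described, the short C sketch below ranks each asset by annual loss exposure (asset value multiplied by the estimated likelihood of a breach) and accepts a countermeasure only when its cost is below the exposure it removes. All asset names and figures are invented for the example; a real analysis would use the organisation’s own valuations and a proper methodology.

/* Toy risk-analysis pass: rank assets by annual loss exposure and
   test each candidate countermeasure against the exposure it removes.
   All names and figures are hypothetical. */
#include <stdio.h>

struct asset {
    const char *name;
    double value;        /* loss if the asset is compromised ($) */
    double likelihood;   /* estimated breaches per year */
    double counter_cost; /* annual cost of the proposed countermeasure ($) */
};

int main(void) {
    struct asset assets[] = {
        { "customer database",     500000.0, 0.10, 20000.0 },
        { "point-of-sale system",  200000.0, 0.05, 30000.0 },
        { "branch-to-branch link", 100000.0, 0.20,  5000.0 },
    };
    int n = sizeof assets / sizeof assets[0];

    for (int i = 0; i < n; i++) {
        double exposure = assets[i].value * assets[i].likelihood;
        int adopt = assets[i].counter_cost < exposure;
        printf("%-22s exposure $%9.2f  countermeasure $%8.2f  %s\n",
               assets[i].name, exposure, assets[i].counter_cost,
               adopt ? "adopt" : "discard (cost outweighs risk)");
    }
    return 0;
}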

Security during systems development

These processes are designed to be applied to the existing systems in an organisation. With the development of new systems, the opportunity to weave security requirements into the fabric of the design presents a luxury for the systems designer. All too often, people involved in the development of IT systems have made statements about the need to consider security early in a project’s lifecycle, without understanding either the importance of security or all of the areas to be considered.

In the past, system developers, when faced with the question of security, have tended to refer back to previous projects and lift the relevant chapter without considering whether all the measures are relevant to the new system, or whether additional protection is required.

Development Phase     Security Activity
IS Strategy           Security Policy
Feasibility Study     High Level Review
Current System
Required System       Application Countermeasures
Technical Options     Technical Countermeasures
Physical Design       Procedural Countermeasures
Testing               Test Compliance
Live Operation        Change Control and Security Administration


Page 9: PROFESSIONAL COMPUTING


In certain projects the need for security may be the overriding concern. For example, it is of paramount importance that a flight control system is constantly available and accurate, and this need for security will influence the design of the system.

Also, the choice of design may affect what constitutes proper security. For example, while it might be enough to identify users by passwords if the system is centralised, a distributed system may need a stronger user identification mechanism.

It is important not to overlook the cost when examining the feasibility of a system. Some security mechanisms, such as encryption, can be very expensive in both capital and running costs. These costs must be included in the estimates made during a feasibility study.

Certain measures, particularly those which affect the design of an application, would be either extremely difficult or impossible to implement after the system had been developed. For example, if the audit requirements have not been properly considered it can mean that either insufficient information is recorded to allow incidents to be detected, or too much information is recorded and it becomes impossible to review an incident with the resources available.

The objective of considering security during the development of a system is to allow the system developer to identify all the proper measures needed to provide adequate security. This must be done in a consistent and comprehensive fashion, eliminating loopholes and inconsistencies that could otherwise have existed in the system’s design and leading to a more secure system.

A structured approach which helps the system developer identify where the potential dangers lie will lead to a clearer understanding of the need for security, and enable effort to be directed to the areas of most need.

By considering security in this manner the system developer is presented with a new view of the data, forcing him to consider not only how the information can be used, but also how it can be abused. This new view will strengthen the developer’s understanding of the system and lead to an improved design, not only in terms of security but also the general quality of the system.

If security is considered during development the review process can be quick to complete, because all the relevant people and information are gathered together in one place.

When should security be considered?

Having explained the reasons why security needs to be addressed throughout a project’s lifecycle, the next question to be resolved is at precisely which steps of the project lifecycle security should be examined.

Security should be considered at the strategy planning stage, and this can form the basis of a corporate Security Policy. This will help define organisation wide standards and policies, and provide a framework into which individual system security policies fit.

During the feasibility study a high level review of the project should be conducted. This will highlight where there is a requirement for security measures which have a major effect, whether in terms of capital costs, running costs, organisational effect, or restrictions on functionality.

When the technical specification is being written the systems developer must specify the security mechanisms that the technical support environment needs to provide. This can be a complex and technical area, in which few systems developers have wide experience, but it is essential that the controls built into the application cannot be compromised.

Continued page 8

Tomorrow’s solutions today... BlockIt! - DDP’s Australian developed security software provides:

[Diagram: BlockIt! for DOS and BlockIt! for Windows linking the workstation to the host security systems ACF2 and RACF.]

...total end to end workstation security.

Head Office - Melbourne: 77 Southbank Boulevard, South Melbourne 3205. Tel: (03) 694 6711, Fax: (03) 686 9036. Int Tel: +61 3 694 6711, Int Fax: +61 3 686 9036

Sydney: 17 Atchison Street St Leonards NSW 2065 Tel: (02) 906 1200 Fax: (02) 906 1290

Perth: Level 3, 83 The Esplanade South Perth WA 6151 Tel: (09) 474 2455 Fax: (09) 474 1616

Products of DDP (NZ) Ltd: Level 6, Waterside House, 220 Willis Street, Wellington NZ. Tel (04) 856 630, Fax: (04) 848 511

Distributed Data Processing Pty Ltd. A.C.N. 005 511 517


Page 10: PROFESSIONAL COMPUTING

Growing personally and less secure

The replacement of the once ubiquitous ‘dumb terminal’ by the personal workstation has created new local security risks and provided better tools for the accidental or deliberate destruction of host information.

Tom Armstrong

Broad view of security systems

From page 7

During the physical design stage, one of the tasks the systems developer is faced with is to specify the procedures which the users should follow. These procedures must include all of those related to the security of the system.

The subjects of testing, quality and assurance are complex areas in their own right; suffice it to say that the extent and rigour of the testing plans will be influenced by the requirements for security.

Once the system has become operational there is a need for the security mechanisms to be maintained and administered. Further development of the system should be controlled under a change control procedure, which would ensure that changes to the system do not invalidate the security policy or increase the requirement for security without proper consideration of how such changes should be made.

Weaknesses of the current development methodologies

The approach taken by many development methods is to consider security as an isolated step at which the systems developer tries to prepare the complete list of the countermeasures that the system needs. This step is often carried out very early in the development of a system, often before major decisions which may affect the proper level of security have been taken.

Although some methods provide guidance on the issues which need to be considered, this is not comprehensive and some important aspects of security may be overlooked. Furthermore, it does not help with the difficult task of justifying the cost to management.

Although each development method can be considered as a tool-box, containing several techniques which allow the systems developer to model the information that will be processed by the system, the system development methods generally lack tools to define the need for security within the system.

In essence, the system developer needs to add the new technique of risk analysis and management to the toolbox of techniques that they already use.

Alan Conrad is a consultant with Technology Australia Pty Ltd. Tel: 03 841 9733

OVER the last 10 years we have seen the personal computer evolve from its humble beginnings as a stand alone device with little memory and little storage into a corporate workstation that rivals the mini computers of today.

It is often connected via complex local area networks to other PCs and frequently linked to super mini or mainframe equipment.

One of the MIS conundrums of the 1990s is how to bring these PC workstations back under central control while still giving end users the facilities they want or need.

While the principal subject here is security, it is worth taking a brief look at a slightly wider picture.

The rapid introduction of the PC as the corporate workstation gives rise to a number of issues not encountered when using host connected terminals. Among these are the need for increased storage space on both the PC itself and the server, and the difficulties of linking these PCs to each other and into host systems. These technical difficulties pale into insignificance when we take a look at the management problems evolving.

It is no secret that management is under pressure to reduce costs while maintaining or increasing service levels. It is also no secret that there is an increasing number of users requiring access to more information from more locations. As the installed base of workstations increases, there is an ever increasing need for support, and therefore more people are required.

A major concern then becomes one of human resources. The simple answer is to add more support people, but an organisation can only add so many ground or field staff before it is necessary to introduce further levels of expensive management.

It is uneconomic and logistically difficult to introduce new software and upgrade the old by means of sending out floppy disks to users; central management of these systems is difficult to implement in any form; asset management becomes a nightmare; and individual workstation security is extremely difficult to implement.

Many of the applications are critical to the conduct of a business, but without the facilities for direct management control over systems there can be no assurance of the integrity or compatibility of data so vital to a company’s need to retain or increase its competitive edge.

We will now take a specific look at the problems of implementing security.

Computer security in the 90s

It can be quite difficult to define the meaning of the word security in the modern business environment. To some it means physically securing machines against theft, but in this instance we will leave that definition to those responsible for the physical protection of premises.

At MIS level we can give the word security two primary meanings. Firstly it means reliability (protection against accidental problems), and secondly it means defensibility (protection against deliberate misuse).

In both cases, security should mean peace of mind — a confident feeling that the system is secure, that it will continue to operate without difficulty and that no unauthorised use is taking place. When security is working best, it is an invisible protector.

There is no doubt that the widespread introduction of the personal workstation has led to a serious threat to the prime asset of any organisation: its information. Mainframes accessed by dedicated terminals are not nearly as vulnerable to intentional or unintentional misuse as is the personal workstation.

Mainframe computers

Host mainframe computers were accessed by terminals which were well controlled as to the information available to the operator of that terminal.

With IBM mainframe equipment, the terminal user was (and still is) isolated from the operating system by the operating system itself.


Page 11: PROFESSIONAL COMPUTING

In addition, most systems had a security product such as RACF or ACF2 which provided a barrier that prevented unauthorised access to the company’s information database.

On top of all this, the terminal was not user programmable, so that any information obtained legitimately from the database could only be subject to pre-determined manipulation. For instance, if a user was intent on destroying information in that database, it would have to be carried out one record at a time. Therefore major destruction or manipulation of company information became a time consuming chore which led to considerably increased chances of detection.

Personal computers

The PC was for many years just what the name suggests — a personal computer. It stood alone and by virtue of its limited storage capacity it held important but comparatively limited company information.

The PC is always subject to ‘finger problems’ as DOS is very flexible by nature and does not provide the controlled environment seen in the mainframe. It has always been simple for the user to access and delete or change operating system and data files — either intentionally or inadvertently.

Programmable workstations, LANs and WANs

The first major change in the order of things came with the availability of terminal emulation software, which let a PC be linked to the host as a terminal. While the host security provisions were still in force, they became somewhat less effective owing to the processing power of the PC.

Record access was still controlled as with a terminal, but while with a terminal it might be necessary to use a number of keystrokes and still only access a single record at a time, programs available on the PC could allow the user to program a situation in which the machine would very quickly and automatically manipulate as many records in the company database as he wished — without any user intervention.

Where are the real security problems?

One of the first concerns that comes to mind is the problems that can be caused by hackers and, as the Australian Computer Abuse Research Bureau, based at the Royal Melbourne Institute of Technology, has shown, these are very real concerns.

However I believe that the most serious threat comes from untrained or careless users. The occasional reports of large scale sabotage are certainly a concern, but by far the largest problem lies in protecting the user against actions that may be unintentionally initiated.

Although there are very few out to intentionally wreck a company’s records, there are millions of users in a position to destroy information without any intent to cause harm.

Security is about asset protection

Decisions on computer security need to be based on a business premise centred around the protection of the organisation’s prime asset.

As we have seen, mainframe security is a well matured environment and host terminals are very difficult to manipulate.

In the PC area, whether using terminal emulation in the host environment or connected to networks, there is often a wide open (if unannounced) invitation to the user to misuse the equipment — intentionally or by accident.

Ideally the user should be allowed the least possible privilege to complete the necessary work.

To do this, a PC-based security system must be able to completely isolate

Continued page 12

PATROL
The Active UNIX Security System

UNIX security is good. C2 certification makes it better. But the best security available is when you have PATROL.

PATROL limits access to the times and terminals you designate.

User Access Controls: Terminal Lines, Number of Logins, Time of Access, Idle Time

Security Reports: Active Alarms, Full Audit Trail

PATROL - the 24-hour 7-day UNIX Security System

** Ask about our Secure Menu System too **

RJM Systems Pty Limited, U2-118 Talavera Road, North Ryde 2113

Tel: (02) X7X 5032, Fax: (02) X7X 5472

PO Box 1826 Macquarie Centre 2113


Page 12: PROFESSIONAL COMPUTING


Opening Moves

Real UNIX and real time merged in Modcomp’s REAL/IX

ALTHOUGH conventional wisdom sees Unix and real-time as mutually exclusive, there is a clear move towards a Unix which will satisfy the needs of real-time users. Because few suppliers have been willing to accept the technical challenge of a full merger of true UNIX with true real-time performance, many interim approaches have evolved. These will become obsolete with the prospect of an effective merger.

The trend has already been endorsed by the various UNIX standards organisations, with POSIX, through its 1003.4 real-time subcommittee, leading the way. Within three to five years, the major computer manufacturers will all offer real-time capabilities as standard features of their general purpose UNIX platforms.

They’ll do this to increase their presence in traditional real-time markets, but they’ll also do it because the basic technology of real-time, when fully merged with UNIX, dramatically improves the performance of UNIX in other applications. The move away from proprietary solutions and towards open-systems based products is by far the most powerful driving force in the computer industry today. Faced with escalating software costs and the need for complex multi-vendor systems, today’s users demand standards-based solutions.

Traditional real-time markets include process control in chemical and petrochemical manufacture, utility and energy plant supervisory control and data acquisition, primary metals processing and material handling, and discrete manufacturing applications in the automotive and aerospace sectors. Other important aerospace applications include wind tunnel model monitoring, data acquisition and control of deep space telescopes, jet engine testing, telemetry, and pre-launch monitoring of space shuttle engines.

Historically, these real-time markets have been served by proprietary systems suppliers with optimised hardware and software architectures tailored to meet stringent real-time requirements. But because the advantages mentioned earlier for standards-based software are equally applicable in real-time environments, standards are evolving, speeding the merger of UNIX and real-time. Systems designed to support real-time applications must respond to external, asynchronous events within a predictable timeframe. Whereas “real-time” is often interpreted to mean “fast”, a better synonym is “predictable” or “deterministic”. With this definition, it becomes clear that the technology inherent in making a system “real-time” can be of significant benefit in any application that requires a predictable response. Many of the requirements discussed below are being incorporated into the POSIX 1003.4 real-time standard, and are not yet part of standard UNIX offerings.

Real-time systems must support asynchronous I/O to maximise predictability. This goes a step further than just making I/O “fast”. It actually allows concurrent execution of other portions of the application while an I/O operation is taking place. Predictable I/O completion is important in some applications.
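The asynchronous I/O requirement eventually surfaced as the POSIX AIO interface. The C sketch below shows its modern form on a present-day POSIX system; it is a minimal illustration rather than anything from REAL/IX itself, the file name is invented, and a production program would block in aio_suspend() rather than spin.

/* Minimal POSIX asynchronous-read sketch: the read is queued and the
   program is free to do other work before collecting the result.
   Assumes a file named "data.bin" exists; link with -lrt on older
   systems. */
#include <aio.h>
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void) {
    char buf[4096];
    struct aiocb cb;

    int fd = open("data.bin", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    memset(&cb, 0, sizeof cb);
    cb.aio_fildes = fd;
    cb.aio_buf    = buf;
    cb.aio_nbytes = sizeof buf;
    cb.aio_offset = 0;

    if (aio_read(&cb) < 0) { perror("aio_read"); return 1; }

    /* ... other application work proceeds concurrently here ... */

    while (aio_error(&cb) == EINPROGRESS)
        ;                                /* or wait in aio_suspend() */

    ssize_t n = aio_return(&cb);
    printf("read %zd bytes asynchronously\n", n);
    close(fd);
    return 0;
}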

Applications must be able to ensure data is actually written to a file or sent to an external device, instead of waiting in a data queue or buffer. This is a key requirement for some online transaction processing (OLTP), for instance.
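On today’s POSIX systems the guarantee described above is obtained by opening a file with O_SYNC (or by calling fsync() after ordinary writes). A minimal sketch, with an invented journal file standing in for an OLTP log:

/* Synchronous-write sketch: with O_SYNC the write() does not return
   until the data has reached the device, rather than sitting in a
   buffer.  File name and record are illustrative. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void) {
    const char record[] = "debit account 42: $100.00\n";

    int fd = open("journal.log", O_WRONLY | O_CREAT | O_APPEND | O_SYNC, 0600);
    if (fd < 0) { perror("open"); return 1; }

    /* returns only after the record is on stable storage */
    if (write(fd, record, strlen(record)) != (ssize_t)strlen(record)) {
        perror("write");
        return 1;
    }
    close(fd);
    return 0;
}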

The concepts of priorities and preemption are important in real-time environments. High priority tasks must be able to preempt execution of a lower priority task whenever they need access to system resources. A key measure of real-time systems is their worst-case delay to the start of execution of a high priority task, in response to either an externally generated event or the completion of a prerequisite operation.
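The fixed-priority, preemptive scheduling discussed here became the sched_setscheduler() interface in the eventual POSIX real-time standard. A minimal sketch, assuming a modern POSIX system and suitable privilege (the priority value is arbitrary):

/* Fixed-priority scheduling sketch using the POSIX interface that
   grew out of 1003.4.  Requires privilege on most systems. */
#include <sched.h>
#include <stdio.h>

int main(void) {
    struct sched_param p;
    p.sched_priority = 50;

    /* 0 = calling process; SCHED_FIFO = run until blocked or
       preempted by a higher priority task */
    if (sched_setscheduler(0, SCHED_FIFO, &p) < 0) {
        perror("sched_setscheduler");
        return 1;
    }
    printf("now running under fixed priority %d\n", p.sched_priority);
    return 0;
}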

Deterministic performance is greatly improved when a system supports preallocation of system resources such as memory or file space to high priority tasks.
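Memory preallocation took the form of mlockall() in the eventual standard: locking a process’s pages into memory removes paging delays from its worst-case response. A hedged sketch for a modern POSIX system (privilege is usually required):

/* Resource-preallocation sketch: lock all current and future pages
   into memory so a high priority task never stalls on a page fault. */
#include <stdio.h>
#include <sys/mman.h>

int main(void) {
    if (mlockall(MCL_CURRENT | MCL_FUTURE) < 0) {
        perror("mlockall");
        return 1;
    }
    /* ... time-critical work, free of paging delays ... */
    munlockall();
    return 0;
}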

Real-time timers and synchronisation mechanisms are needed to allow the scheduling of events and the predictable tracking of elapsed time. Separate processes must be able to communicate, and to reliably synchronise their execution.
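For predictable elapsed-time tracking, the standardised interfaces include per-process timers and absolute-time sleeps. The sketch below runs a periodic task with clock_nanosleep() against an absolute deadline, which avoids the drift of relative sleeps; the 10 ms period is arbitrary and the loop is shortened for illustration.

/* Periodic-scheduling sketch: sleeping to an absolute deadline keeps
   the period stable even if the work takes a variable time. */
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int tick = 0; tick < 5; tick++) {
        next.tv_nsec += 10 * 1000 * 1000;        /* +10 ms */
        if (next.tv_nsec >= 1000000000L) {       /* carry into seconds */
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        printf("tick %d\n", tick);               /* periodic work here */
    }
    return 0;
}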

IEEE’s 1003.4 group is developing the minimum changes to the 1003.1 specification to meet these requirements while ensuring the portability of real-time programs. The committee addresses 10 topics that are deemed necessary to achieve this goal.

AT&T, OSF and X/Open have all committed to following the 1003.4 standard in their implementations of real-time functionality. AT&T’s UNIX System V, for instance, includes some of the real-time extensions, but unfortunately does not yet achieve true real-time performance, for reasons that will be discussed later. The 1003.4 POSIX group has the support of all real-time vendors. A final specification should be issued in late 1990 and approved by mid-1991. For companies using UNIX that have not been immersed in the challenges of real-time prior to release of this new standard, fully debugged implementations of the real-time functionality will take at least a full year. This means that the merging of UNIX and real-time will not be complete until 1993, except by those few vendors who have been appropriately focused.

The traditional real-time markets can of course be well served by a true real-time implementation of UNIX.

Page 13: PROFESSIONAL COMPUTING

The technology of real-time systems design, coupled with the portability advantages of UNIX, will also serve other non-real-time markets.

UNIX is already the de facto standard operating system in many communications products. With the merging of real-time functionality into UNIX, these products will be able to adhere more closely to standards, and offer increased capability.

OLTP applications benefit from real-time techniques such as synchronous I/O, priority-based scheduling, and fixed resource allocation including the ability to lock processes into memory for deterministic response. Real-time UNIX provides these tools, while retaining full access to the expanding range of UNIX commercial software.

Real-time UNIX supporting multiple threads of operation through a single kernel can optimise the performance of the Ada language. Accordingly, UNIX will be used to address problems in simulation, military command and control, and other government and aerospace markets that require the use of Ada.

The true integration of UNIX and real-time is the ideal approach to providing this capability, but several interim approaches have evolved over the past few years:

Many designers have simply moved real-time functionality off-line to a separate embedded real-time executive kernel. This forces a user to learn and use two different environments — one for development and another for execution.

Examples of real-time executive kernels include VxWorks, by Wind River Systems; C Executive, by JMI Software Consultants; and VRTX, by Ready Systems. They are not built from standard AT&T UNIX — they are proprietary kernels with UNIX-like interfaces.

As UNIX and real-time continue to merge over the next few years, this approach will begin to lose its appeal. Even in applications that require a scaled down kernel or diskless operation, designers will be better served by a scaled down POSIX-compliant true real-time UNIX kernel than by one of these non-standard executives.

The fully featured real-time UNIX will be used both for development purposes and as the host of a network of diskless embedded systems, which will themselves be running a scaled down subset of the same real-time UNIX.

These diskless real-time executives are not the only real-time solutions that provide UNIX interfaces without starting from the standard AT&T UNIX source tree.

An ambitious approach is to develop a full-featured real-time operating system and make it UNIX-like in terms of system calls and interfaces. This allows the design of full preemption into the kernel from the ground up, but necessarily impacts compliance with standards. Application programs developed in compliance with the SVID may not run on such systems, and binary compatibility is even more at risk. Systems based on this approach also have difficulty conforming to new releases of UNIX.

This approach has been used in LynxOS, by Lynx Real-Time Systems; Regulus, by Alcyon and SBE Inc; and RTU, by Masscomp and Concurrent Computer Corporation. Clearly, as UNIX and real-time merge, these vendors will be hard-pressed to maintain a competitive advantage over a true real-time UNIX. Adherence to standards and compatibility concerns make the best approach one that starts from a true AT&T UNIX source. The desired degree of real-time performance can lead to different implementation paths, each a subset of the next.

Adding any number of the real-time extensions postulated in the POSIX 1003.4 draft ensures standardisation, but does not provide true real-time performance. AT&T’s System V.4, for instance, incorporates some 1003.4 real-time extensions, but it does not achieve deterministic real-time performance. This approach must be coupled with some form of kernel preemption to provide acceptable response improvements.

The POSIX 1003.4 draft specifies several preemption points, at which the UNIX kernel must accept an interrupt and perform a context switch to a high priority task. Various designers have adopted this philosophy, and some have added more preemption points than those specified by POSIX.

A true real-time system must provide worst case predictable and deterministic response. When preemption can only occur at certain points in the kernel, worst case response is unacceptably high — in the millisecond, rather than the achievable microsecond, range.

The design goals for MODCOMP’s REAL/IX were full SVVS compliance along with binary compatibility with Motorola’s UNIX V/68, true deterministic real-time performance, and compliance with the emerging POSIX real-time standards.

MODCOMP’s engineers started from the standard AT&T UNIX and Motorola V/68 source trees. Then they implemented full preemption and countless code refinements into the kernel, along with a robust and consistent data structure protection scheme.

REAL/IX conforms fully to AT&T’s

SVID and passes the SVVS. It also re­tains full binary compatibility with Motorola’s V/68. The hardware-inde­pendence required to support applica­tion portability is therefore assured. In addition, REAL/IX satisfies all of the topics being addressed by the evolving POSIX 1003.4 real-time standard. It will of course conform fully with the final standard — MODCOMP is involved with the committee and has hosted some of their meetings.

REAL/IX supports preallocation of memory resources, and incorporates a fixed-priority process scheduler to ensure predictable response for high-priority tasks. I/O is also scheduled based on the priority of the originating task. The system's timers have 1/256th-second resolution, and it offers enhanced interprocess communications including shared memory, binary semaphores, and enhanced signalling.
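The article does not give REAL/IX's own calls, but the programming model can be sketched with the equivalent interfaces as they were later standardised out of the 1003.4 work: lock the process into memory so it can never stall on a page fault, then give it a fixed real-time priority. The priority value of 50 is an arbitrary assumption.

    /* Sketch of the later-standardised POSIX calls, not REAL/IX's API. */
    #include <stdio.h>
    #include <sched.h>
    #include <sys/mman.h>

    int main(void)
    {
        struct sched_param sp = { .sched_priority = 50 };

        /* "Preallocation": pin all current and future pages into RAM
         * so the process never waits on the paging subsystem. */
        if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0)
            perror("mlockall");

        /* Fixed-priority scheduling: a SCHED_FIFO process preempts all
         * time-shared work and runs until it blocks or yields. */
        if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
            perror("sched_setscheduler");

        /* ... time-critical work runs here at fixed priority ... */
        return 0;
    }

Both calls ordinarily require appropriate privilege, since a runaway fixed-priority process can starve the rest of the system.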

The file subsystem has been significantly enhanced. While familiar UNIX access methods are of course supported, the enhancements are of great value for real-time applications. I/O is performed in priority-based order. Asynchronous I/O is supported, and programs can bypass the buffer cache to guarantee reliable synchronous reads or writes. Larger block sizes of up to 128k bytes provide faster throughput. As well, contiguous file space can be pre-allocated to ensure fast storage availability.
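Again as an illustration rather than REAL/IX's actual interface, the asynchronous-I/O model that 1003.4 proposed looks like this in its later standardised form: the read is queued, the caller is free to continue with time-critical work, and completion is collected separately. The file name sensor.dat is a placeholder.

    /* Sketch of POSIX asynchronous I/O; link with "-lrt" on some systems. */
    #include <aio.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        static char buf[4096];
        int fd = open("sensor.dat", O_RDONLY);   /* placeholder file name */
        if (fd < 0) { perror("open"); return 1; }

        struct aiocb cb;
        memset(&cb, 0, sizeof cb);
        cb.aio_fildes = fd;
        cb.aio_buf    = buf;
        cb.aio_nbytes = sizeof buf;
        cb.aio_offset = 0;

        if (aio_read(&cb) != 0) { perror("aio_read"); return 1; }

        /* The caller continues with time-critical work while the read
         * proceeds; here we simply wait for completion. */
        const struct aiocb *list[1] = { &cb };
        aio_suspend(list, 1, NULL);

        printf("read %zd bytes asynchronously\n", aio_return(&cb));
        close(fd);
        return 0;
    }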

External I/O is also improved. A user program can perform direct I/O with external devices, and can be directly notified of an external hardware interrupt. The small computer systems interface (SCSI) standard is used, and MODCOMP has simplified its administration — allowing the user to easily add new devices in highly configurable real-time environments. Another unique facility allows users to install new system calls and real-time device drivers into the REAL/IX kernel without access to source.
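The article names the interrupt-notification facility without giving its interface, so the following is purely a hypothetical pattern, not REAL/IX code: a driver is asked, via an invented ioctl request, to deliver a signal to the process whenever the device interrupts. The device node /dev/adc0 and the request code are both made up for illustration.

    /* Hypothetical sketch only: the ioctl request and device are invented. */
    #include <fcntl.h>
    #include <signal.h>
    #include <stdio.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    #define DEV_NOTIFY_SIG 0x4e01      /* invented ioctl request code */

    static volatile sig_atomic_t got_interrupt = 0;

    static void on_interrupt(int sig)
    {
        (void)sig;
        got_interrupt = 1;             /* the hardware event reached us */
    }

    int main(void)
    {
        int fd = open("/dev/adc0", O_RDWR);   /* invented device node */
        if (fd < 0) { perror("open"); return 1; }

        signal(SIGUSR1, on_interrupt);

        /* Hypothetical request: ask the driver to raise SIGUSR1 in this
         * process whenever the device interrupts. */
        if (ioctl(fd, DEV_NOTIFY_SIG, SIGUSR1) != 0)
            perror("ioctl");

        while (!got_interrupt)
            pause();                   /* sleep until the signal arrives */

        printf("device interrupt delivered to user space\n");
        close(fd);
        return 0;
    }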

Standard UNIX benchmarks have shown the REAL/IX implementation to offer significant performance improvements in any application environment. The faster internal performance of the system speeds I/O, system call handling, and the operation of tools and compilers, by a factor of two in many cases.

As a result, REAL/IX offers the best of both worlds. It merges standards-compatible UNIX with true real-time performance, providing context-switch latencies from 80 microseconds to a worst case of 360 microseconds. This is in the range of the proprietary real-time operating systems! A full development environment is therefore completely integrated with a true real-time environment. Commercial applications can coexist with real-time ones.


Page 14: PROFESSIONAL COMPUTING

Growing personally, less secure
From page 9

the user from the operating system and in addition it must place that user in the same position as if a host terminal were in operation. This means that access to the organisation's information is at all times on a controlled authorisation basis. The system must address both the environmental issues and the authorisation requirements necessary to implement effective information asset protection.

We will now take a closer look at some of the issues that occur in local area networks.

Physically, the difference between a local area network (LAN) and more traditional forms of communication between computers is that a LAN uses one shared cable to connect several different computers. Traditionally, computers had a single point-to-point connection between two machines.

In a LAN it is therefore immediately obvious that the problems of overall security are greater than with point-to-point connections. If one part of the shared cable becomes insecure, the whole LAN becomes insecure. If one cable becomes insecure in the traditional point-to-point scheme, only the connection between the two computers is affected.

Also, with a LAN every connected computer is a potential point of insecurity. From the viewpoint of reliability, every computer connected to the LAN can cause problems and therefore affect the network. From the viewpoint of data security, every computer connected to the LAN is a listening device and also a potential originator of false data.

In a traditional point-to-point system, only a fault in a central computer can affect the whole system, and terminals only receive or transmit data when actively permitted by a central machine.

A LAN still has most of the potential security problems of a traditional central computer system. In most LANs there is also a central computer, a server, which holds vital data or operating instructions required for the correct operation of the overall system.

Unauthorised access to central computers can be just as dangerous with a LAN as with a traditional system, and both kinds of system often have modem connections through which outsiders might gain access.

Therefore all traditional methods of safeguarding computer security are equally applicable to LAN-based systems. These include passwords, user verification, encoded files, and access auditing. The difference with LANs is that the physical method of transmitting signals leads to a special new set of problems.

Ethernet and Token Ring

The leading LAN standards, Ethernet and Token Ring, have completely different methods of using a shared network cable.

In the case of Ethernet, the cable containing the network data passes by all the computers in the network. In fact, in many Ethernet cabling schemes the data passes through a connection at the back of a computer even when the computer is switched off and is not connected to the network.

Ethernet's concept of operation depends on this easy access to all points on the cable. Computers are expected to listen in to the cable to check that it's not busy before they start transmitting data.
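This listen-before-transmit rule (carrier sense, with collision detection and backoff) can be sketched as a toy simulation in C; the hardware behaviour here is faked with simple stand-in functions, but the control flow is the essence of the Ethernet access method.

    /* Toy simulation of CSMA/CD; the "hardware" is faked for illustration. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <stdlib.h>

    static int busy_ticks = 3;                 /* cable busy for a few polls */
    static bool carrier_sensed(void)   { return busy_ticks-- > 0; }
    static bool collision_detected(void) { return rand() % 4 == 0; }

    static void csma_cd_send(int frame_id)
    {
        for (int attempt = 0; attempt < 16; attempt++) {
            while (carrier_sensed())
                ;                              /* defer while another station talks */
            printf("frame %d: transmitting (attempt %d)\n",
                   frame_id, attempt + 1);
            if (!collision_detected())
                return;                        /* frame got through cleanly */
            /* collision: real adapters back off for a random, growing interval */
            printf("frame %d: collision, backing off\n", frame_id);
        }
        printf("frame %d: abandoned after 16 attempts\n", frame_id);
    }

    int main(void)
    {
        csma_cd_send(1);
        return 0;
    }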

Computers which are active in a Token Ring LAN listen to the network to see if any frames are addressed to them, and there are some special machines which read all frames from the network even when they are not the addressees. However, computers only join the LAN after they have identified themselves, and they can be denied access rights to enter the network.

Token Ring LANs contain sophisticated network management features at each node. These are built into the Medium Access Control (MAC) protocol layer. One of these features is that each computer announces itself when it wishes to join the LAN, and if a central ring manager computer (known in MAC-layer terminology as a Ring Parameter Server) is present, this manager computer can prevent the new computer from joining the LAN.

After a computer is on a Token Ring, it regularly sends out a message to indicate that it is active, and will also respond from the MAC layer if it is checked by the central ring manager. The ring manager can, if it wishes, send an instruction to remove a particular computer from the LAN.

Each computer also sends out messages on the LAN if it detects an error either in itself or in a neighbouring computer on the ring. These messages can be collected by the ring manager (acting as an 'error monitor') and displayed to a network administrator for the purposes of fault finding. In extreme cases of LAN faults, a node on a Token Ring will remove itself automatically from the network and re-test both itself and the lobe cable; it will then only re-join if the tests pass with no problem.

The ring manager

Token Ring therefore contains integral features in its hardware and MAC layer protocols to make a network more secure, both in terms of detecting and preventing unauthorised access and in limiting the damage caused by faults on the ring.

Some of these features are self-healing. A faulty node can automatically remove itself from the network, and all nodes have the ability to re-start without causing any harm to higher-level operations on the network if faults only last for a short space of time. In fact a Token Ring LAN stops momentarily each time a new node joins or leaves the network, but this causes no practical difficulties.

Other items require centralised monitoring and management. Serious errors on the network which cannot be self-cured need the intervention of a human network administrator — who needs information on what has been happening on the network, and tools with which to take action. Attempts to gain unauthorised access to the network can be rejected by a central ring manager, and can cause an alarm to be sent to the network administrator.

[Chart: support overheads against the number of workstations, with and without central controls]

Page 15: PROFESSIONAL COMPUTING

For security control, the network administrator can set up a list of authorised network nodes and the ring manager will prevent unauthorised nodes from joining the ring. In case a computer succeeds in by-passing this network-joining check, the ring manager detects the unauthorised node and removes it later. These measures cope with a serious security problem of most LANs.
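The MAC-layer detail sits in vendor firmware, but the check itself is simple to sketch: compare the joining node's address against the administrator's list and reject anything unknown. The addresses below are invented for illustration.

    /* Toy sketch of a ring manager's authorised-node check. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    #define ADDR_LEN 6   /* 48-bit ring-station address */

    /* Invented administrator-supplied list of authorised nodes: */
    static const unsigned char authorised[][ADDR_LEN] = {
        { 0x40, 0x00, 0x00, 0x00, 0x00, 0x01 },
        { 0x40, 0x00, 0x00, 0x00, 0x00, 0x02 },
    };

    static bool may_join_ring(const unsigned char *addr)
    {
        for (size_t i = 0; i < sizeof authorised / sizeof authorised[0]; i++)
            if (memcmp(addr, authorised[i], ADDR_LEN) == 0)
                return true;
        return false;   /* reject the join and alert the administrator */
    }

    int main(void)
    {
        const unsigned char intruder[ADDR_LEN] = { 0x40, 0, 0, 0, 0, 0x99 };
        printf("intruder may join: %s\n",
               may_join_ring(intruder) ? "yes" : "no");
        return 0;
    }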

As said earlier, the best kind of security is the security which is never noticed.

"If the LAN is trouble-free and you can be confident that your data will not be falling into the wrong hands, then you have ideal LAN security."

However, this is only possible if the LAN is continuously maintaining its own integrity, and if a LAN administrator has the tools to manage the network.

With the advent of the personal computer as the corporate workstation, the need for security has expanded well beyond the view that the most dangerous threat is from the hacker who is intent on malicious damage. Threats from hackers are as real as ever, but the larger threat comes from accidental damage to information being processed on that PC — a machine with no inbuilt protection of any sort.

Tom Armstrong is technical director of Melbourne-based Distributed Data Processing Pty Ltd. Tel: 03 615 6711.

PROFESSIONAL COMPUTING
THE MAGAZINE OF THE AUSTRALIAN COMPUTER SOCIETY

JUNE FEATURE: Performance measurement, development environments and CASE.

Editorial and Advertising enquiries ph (03) 520 5555

ACS in VIEW

PRESIDENT'S MESSAGE:

From the President's guest: Geoff Dober

IT WAS most gratifying to see a large audience when the NSW Branch hosted a breakfast to launch the PCP Directory of Endorsed Courses during February.

By now, most members and associates will have received the PCP Directory through the mail.

The PCP scheme has received favourable comments from many sources, but, not surprisingly, there has been criticism by some.

IT professionals frequently 'inflict' change on others as they introduce new systems. It is interesting therefore to see how we react when change is done to us.

How have members reacted to PCP? The Victorian Branch has recently conducted a member survey and many members commented on the scheme. While the vast majority supported the scheme, some of the negative comments are reproduced below.

“Carries no weight with a large num­ber of employers.”

“Not possible in a low budget organisation.”

“I regard it as a stunt by the Branch Committee to promote themselves.”

“Until the ACS is viewed more highly by the business community, I see no value to me to spend time and money becoming a PCP”

“My courses are all in-house and not accredited.”

“I still don’t understand it.”

[Diagram: the competence matrix]

                KNOW                        DON'T KNOW
KNOW            Competent                   Consciously Incompetent
DON'T KNOW      Unconsciously Competent     Unconsciously Incompetent

“You should start recognising courses from manufacturers, e.g. Oracle.”

“Needs promotion to the public like CPA.”

“Can’t afford it.”

“What about conferences and seminars not endorsed?”

“I don’t need to prove my professionalism.”

It would appear from these comments that we are just the same as the many to whom we subject our changes. We are not good at reading the documentation, as many of these comments are not correct interpretations of the scheme. But it proves something we knew all along: we have to work hard at making sure that the scheme is properly understood and, if necessary, modify it to improve it. We accept this challenge and are delighted that the editor of Professional Computing has decided to start a regular PCP section from the May issue. Send your feedback forms with comments from the Directory of Endorsed Courses and we will include your views in the PCP section.

At the launch, I explained the PCP scheme with the aid of the diagram reproduced above.

Self-awareness is on the vertical axis and knowledge on the horizontal axis. It reads as follows:

If you know you know, then you are competent.

If you know you don’t know, you are consciously incompetent.

If you don’t know you know, you are unconsciously competent.

If you don’t know you don’t know, then you are unconsciously incompetent (or unconscious!)

The PCP scheme is our conscious attempt to encourage professional members of the ACS to maintain and extend their competency so that our profession can make a significant contribution to Australia's future.

I commend the PCP scheme to you. I will be consciously seeking to enhance my own competence and in so doing accrue the 30 hours needed to achieve PCP status.


Page 16: PROFESSIONAL COMPUTING

ACS in VIEW

Security in Information Systems

THE activities of Technical Committee 11 (Security and Protection in Information Processing Systems) have grown over the years since its formation in the early 1980s. However, before going into the structure of the TC it may be worthwhile to consider some major events in computer security during 1990.

In general, the growth of worldwide awareness of the need for control, management, performance and security (CMPS) in information systems continued during 1990 and appears set to take a major role in the 1990s. Security concerns encompassed both computer systems, including PCs and workstations, and associated data communications networks.

From the publication of the so-called European "White Book", the Information Technology Security Evaluation Criteria (ITSEC), to a major report by the National Research Council in the USA, to the enactment of the United Kingdom's "Computer Misuse Act 1990", to the growth of so-called "stealth" computer viruses, activity developed on many fronts. In particular, a sense of, and dedication to, the CMPS aspects of the information systems they create and/or support is now being seen as one of the hallmarks of a computer professional. On the academic/education front, computer security studies entered the mainstream of computer science and information systems education at a number of locations throughout the world.

These, along with other aspects of information systems security, are the concern of IFIP's Technical Committee 11 (Security and Protection in Information Processing Systems), in which Australia is represented.

USA National Research Council urges $15m Computer Security Foundation

Professor William (Bill) J. Caelli, FACS, is the Director of the Information Security Research Centre at the Queensland University of Technology and Technical Director of Eracom Pty Ltd, a company he founded in 1979. Professor Caelli is the chairman of IFIP's TC 11 for 'Security and Protection in Information Processing Systems'.

He has had over 25 years' experience in the computer industry, starting with Australia's largest company, BHP, and later with the ANU, Hewlett-Packard and Control Data, with positions in research, consultancy and marketing. He was made a Fellow of the ACS in 1982 and in 1986 was awarded the AITA award for achievement in the Information Technology Industry. He is also a member of the ACM and IEEE of the USA.

Of major note during 1990 was the report by the National Research Council (NRC) of the USA that noted the "failure of government and industry to protect computer security" (The Asian Wall Street Journal, 7 December 1990). To combat this situation the NRC suggested the formation of a private "foundation" with an annual budget of around $15 to $20 million (US) to coordinate and oversee activities in the computer security area. The foundation would, in particular, develop and circulate security standards, evaluate systems and maintain a catalogue of computer misuse cases. Once again the report links computer security (privacy, integrity and authenticity) with related problems of reliability, system "safety", and the like. This emphasises that the discipline of information security is rapidly starting to work in with the broad area of "software engineering". The interesting point about this proposal is that it avoids the perceived problem of government involvement in the commercial aspects of the problem, as well as the reported "competition" between two government groups in the USA with computer security interests, the National Security Agency (NSA) and the National Institute for Standards and Technology (NIST).

"White" Book versus "Orange" Book

During 1990 four members of the European Community, Germany, France, The Netherlands and the United Kingdom, released a set of proposed standards for the evaluation of the security aspects of computer systems, under the broad title of the "Information Technology Security Evaluation Criteria", or ITSEC. This suggested standard built upon the earlier work done in the USA in the same area, which resulted in the publication of what has become known as the "rainbow series" of reports on computer security evaluation. The most famous of these is the so-called "Orange Book", named after the colour of its covers and more accurately known as the "Trusted Computer Systems Evaluation Criteria", or TCSEC for short. This latter book, and the other members of the series, were essentially published by the United States Department of Defense. (Note: these books have also been recently published by Macmillan in the United Kingdom.)

The significant point is that the "books" do not exactly match, and computer manufacturers and service providers, many based in the United States, are concerned over the presence of two separate sets of criteria by which their computer hardware and, particularly, their operating systems may be judged. Just as important, the cost of performing an evaluation of the security of a system is very large!

European manufacturers in turn claim that they cannot be subject to US requirements. They also appear to be "unhappy" with the "rainbow series" and want security criteria more closely matched to commercial, rather than military, needs. The general feeling appears to be that there is now a real need for international agreement in this area, particularly since these same US companies see a substantial proportion of their business (often more than 50 per cent) coming from Europe. Meanwhile, particularly in the USA, security compliance with the "Orange Book" is being seen as a mandatory requirement for government purchasing of computer systems by 1992 (the so-called C2-by-92 program).

Page 17: PROFESSIONAL COMPUTING

From the viewpoint of Australia and South-East Asia this important activity may appear to have little significance since, essentially, no computer systems are fully manufactured here. By this I mean that the R&D, the system software design and implementation (eg PC/MS-DOS, OS/2, UNIX, etc.), as well as the intellectual property, are not vested in the region. Thus a full computer security validation centre (along the lines of the USA's National Computer Security Centre, or NCSC) or similar could have little purpose besides being very expensive.

However, foreign manufacturers and importers/assemblers will still use the appropriate criteria in describing their systems, eg our XXXX system has "B2" compliance, our YYYY system meets "F2,E2" of the "White Book", etc. Thus it becomes vital that the southern hemisphere has knowledge of these criteria and techniques and is able to check the claims made by overseas suppliers. This could represent an opportunity for collaborative effort with our neighbouring countries, along the lines of the need for independent testing facilities for compliance with Open Systems Interconnection (OSI) specifications. In particular, a leadership role could be possible in relation to small and personal computer systems.

After all, recent figures would seem to indicate that the Asia-Pacific region will have the highest growth rate in the information technology market over the next few years, and that the personal computer and workstation sector already accounts for well over 50 per cent of the overall world IT market.

Computer Misuse Act — United Kingdom

There is now a growing body of computer security and related privacy legislation developing throughout the world. In 1990, the United Kingdom enacted its "Computer Misuse Act 1990", which complements the earlier "Data Protection Act". The Act sets out provisions for "securing computer material against unauthorised access or modification; and for connected purposes". This Act particularly addresses unlawful "access" to the computer. However, to be guilty a person must "know" at the time of illicit access that such access is "unauthorised". The Act may also help to convict anyone found implanting "computer virus" code in a system, since a specific section of the Act (part 3.1) makes a person guilty of an offence if any action "causes an unauthorised modification of the contents of any computer."

IFIP TC-11

IFIP's (International Federation for Information Processing) Technical Committee 11 has been in operation now for over seven years and has sponsored a series of important international conferences, the IFIP/SEC series. Members of the Technical Committee are nominated by the member societies of IFIP, such as Australia through the Australian Computer Society (ACS). The full committee meets once a year, usually in association with the worldwide IFIP/SEC conference. The next meeting will be in Brighton, England, in association with the IFIP/Sec'91 conference, entitled "Creating Confidence in Information Processing" (15-17 May, 1991). The 1990 meeting was held in Helsinki, Finland, in May 1990 in conjunction with the IFIP/SEC'90 conference and exhibition. About 300 people attended from 29 countries. The theme of that conference was "Computer Security and Information Integrity in our Changing World" and the proceedings of the conference will soon be available from IFIP's publisher, Elsevier/North-Holland.

The Technical Committee has a number of Working Groups (WGs) that meet to consider individual aspects of the overall problem of the security of information systems. These Groups are currently:

WG 11.1 Security Management
WG 11.2 Office Automation Security
WG 11.3 Database Security
WG 11.4 Crypto Management
WG 11.5 Systems Integrity and Control
WG 11.6 *** Discontinued ***
WG 11.7 Computer Security Law

Computer professionals active in the appropriate fields are invited to take an active part in the working groups and to contribute to the activities of the WGs on an international basis. Working groups also aim to meet once a year, normally in association with a limited conference or workshop. Further information on IFIP TC-11 is available from:

Professor Bill Caelli, FACS
Information Security Research Centre
Queensland University of Technology
GPO Box 2434, BRISBANE QLD 4001
Tel: 07 864 2752 Fax: 07 221 2384
e-mail: w.caelli@qut.edu.au

ACS Examinations

The next ACS 'Examination in Computing' will be held on 23 and 30 June 1991. Successful examination candidates are eligible for entry into the professional grades of membership of the Society.

The papers being offered are:

23 June: BASIC COMPUTER CONCEPTS and PROGRAMMING TECHNIQUES

30 June: SYSTEMS ANALYSIS and DESIGN, DATA MANAGEMENT, and DATA COMMUNICATIONS

Interested people should contact their local branch of the ACS or Donna Edwards at the National Office on 02 211 5855 for an application form.

The Australian Computer Society

Office bearers
President: Alan Underwood. Vice-presidents: Peter Murton, Geoff Dober. Immediate past president: John Goddard. National treasurer: Glen Heinrich. Chief executive officer: Ashley Goldsworthy.
PO Box 319, Darlinghurst NSW 2010. Telephone (02) 211 5855. Fax (02) 281 1208.

Peter Isaacson Publications (Incorporated in Victoria)

PROFESSIONAL COMPUTING
Editor: Tony Blackmore. Editor-in-chief: Peter Isaacson. Publisher: Susan Coleman. Advertising coordinator: Linda Kavan. Subscriptions: Jo Anne Birtles. Director of the Publications Board: John Hughes.

Subscriptions, orders, editorial, correspondence
Professional Computing, 45-50 Porter St, Prahran, Victoria, 3181. Telephone (03) 520 5555. Telex 30880. Fax (03) 510 3489.

Advertising
National sales manager: Peter Dwyer.
Professional Computing, an official publication of the Australian Computer Society Incorporated, is published by ACS/PI Publications, 45-50 Porter Street, Prahran, Victoria, 3181.

Opinions expressed by authors in Professional Computing are not necessarily those of the ACS or Peter Isaacson Publications.

While every care will be taken, the publishers cannot accept responsibility for articles and photographs submitted for publication.

The annual subscription is $50.


Page 18: PROFESSIONAL COMPUTING

FINANCE
Commercial Lease/HP/Mortgage, Operating Lease or Rental
PCs, MINIS AND MAINFRAMES — NEW & 2ND HAND. GOVERNMENT & SEMI-GOVERNMENT AUTHORITIES; PRIVATE & PUBLIC COMPANIES.
Contact: Mr Bill Purton, (018) 373 490 or (016) 375 411
Money Resources (Vic) Pty Ltd
Licensed finance brokers

OPEN SYSTEMS

Macpherson Open Systems — Open System Specialists
Unix, Pick, Project Development. Expertise available in Progress and System Builder.
Sydney (02) 416 2788 (Greg MacPherson); Melbourne (03) 866 1177 (Ashley Isserow)

HARDWARE — SUNS +
BUY, SELL, RENT, ALL BRANDS. "COMPUTERS ARE CHEAPER THE SECOND TIME AROUND"
CALL US NOW (02) 949 7144
COMPUTER RESELLERS INTERNATIONAL
Unit 25, Balgowlah Business Park, 28 Roseberry Street, Balgowlah NSW 2093. FAX (02) 949 4419

Workstation Traders
Full range of disk sub-systems, 100MB to 2GB; SCSI, SMD, IPI
Sun Microsystems approved Second User & Rental Organization
(02) 906 6229

Advertising conditions
Advertising accepted for publication in Professional Computing is subject to the conditions set out in their rate cards, and the rules applicable to advertising laid down from time to time by the Media Council of Australia. Every advertisement is subject to the publisher's approval. No responsibility is taken for any loss due to the failure of an advertisement to appear according to instructions.
The positioning or placing of an advertisement within the accepted classifications is at the discretion of Professional Computing except where specially instructed and agreed upon by the publisher.
Rates are based on the understanding that the monetary level ordered is used within the period of the order. Maximum period of any order is one year. Should an advertiser fail to use the total monetary level ordered, the rate will be amended to coincide with the amount of space used. The word "advertisement" will be used on copy which, in the opinion of the publisher, resembles editorial matter.
The above information is subject to change, without notification, at the discretion of the publisher.

Warranty and indemnity
ADVERTISERS and/or advertising agencies upon and by lodging material with the publisher for publication or authorising or approving of the publication of any material, INDEMNIFY the publisher, its servants and agents against all liability claims or proceedings whatsoever arising from the publication, and without limiting the generality of the foregoing to indemnify each of them in relation to defamation, slander of title, breach of copyright, infringement of trademarks or names of publication titles, unfair competition or trade practices, royalties or violation of rights or privacy AND WARRANT that the material complies with all relevant laws and regulations and that its publication will not give rise to any rights against or liabilities in the publisher, its servants or agents and, in particular, that nothing therein is capable of being misleading or deceptive or otherwise in breach of Part V of the Trade Practices Act 1974.

Bookings - Tel: (03) 520 5555, Fax: (03) 521 3647

Page 19: PROFESSIONAL COMPUTING
Page 20: PROFESSIONAL COMPUTING


MICRO UPS (SERIES 45) - 2 sizes of basic UPS for stand-alone PCs and small network servers. Protects and provides back-up for 5-20 minutes. From $889*

MINI UPS (SERIES 75) - Available in 0.5, 1, 1.5 & 2kVA capacities and various back-up times. Provides mains isolation for total security plus a standard network interface. From $2565*

SIDEKICK PLUS UPS - 1.5kVA transfer UPS with 7-18 minutes integral batteries, networking interface and optional RS232 comms. From $3200*

INTERACT UPS - 3 or 5kVA no-break UPS with 10-minute integral batteries, RS232 comms and network or AS400 interface. From $9875*

Available from all leading computer stores, or....

COMPUTER GRADE POWER
Customer Service Centres located at....

Melbourne: Ph 03-706 5662, Fax 03-794 9150
Sydney: Ph 02-949 6000, Fax 02-907 9802
Brisbane: Ph 07-841 0122, Fax 07-841 0139
Adelaide: Ph 08-347 3622, Fax 08-234 0339
Perth: Ph 09-445 2500, Fax 09-244 2674

* Recommended Retail Price (including Sales Tax).
