Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!mailrus!wuarchive!gem.mps.ohio-state.edu!uwm.edu!uakari.primate.wisc.edu!polyslo!vlsi3b15!vax1.cc.lehigh.edu!sei.cmu.edu!krvw
From: chinet!ignatz@att.att.com
Newsgroups: comp.virus
Subject: Tiger Teams (Was Re: Good viruses?)
Message-ID: <0015.8909281133.AA14331@ge.sei.cmu.edu>
Date: 27 Sep 89 19:34:37 GMT
Sender: Virus Discussion List 
Lines: 55
Approved: krvw@sei.cmu.edu

In article <0002.8909261721.AA06193@ge.sei.cmu.edu> dmg@retina.mitre.org (David Gursky) writes:
			...
>Suppose a company has stringent rules about protecting desktop
>computers from viruses.  How do you go about ensuring the rules are
>being followed?  One thought I had was the use of "Tiger Teams".

And goes on to describe a "Tiger Team" which would prowl the halls
after-hours, looking for unsecured desktop machines which it could
then infect with an "approved" virus, preparatory to an unpleasant
visit by the PC Police the next day.

Presumably, the purpose of actually infecting the machine is to
provide an object lesson to the unhappy employee careless enough
not to lock the system.  This, however, is Not A Good Idea, for many
reasons.  First, you've disrupted the productivity of a probably
useful employee for at least half a day while his/her machine is out
of commission.  Next, you're tying up one or more people on the
"Tiger Team"; worse, as proposed, they'd be putting in non-prime
hours on what is essentially an overhead (read "costs money, makes
none") task.  You're also setting up the kind of confrontational
situation that breeds stressful relations between employees.  And
it's not necessary.  Not to mention that other security holes go
unaddressed, such as terminals left logged into multi-user systems,
which can just as easily be used to corrupt or destroy company data
and programs.  And what about desktop or cubicle multi-user and/or
multi-tasking systems, such as small Unix/Xenix boxes, VAX/VMS
workstations, etc.?  Consider gaining access to those and then
corrupting them, and you'll start to see that this is a form of
sanctioned cracking which is beneficial to none, and detrimental to
all.

More useful, and actually used at many client sites I've been
assigned to, is simply to have the guard--who must make rounds
anyway--also be responsible for checking certain criteria for
computer equipment: locked access where applicable, no media left
lying about unattended, login-protected terminals (whether remote
timesharing, desktop multi-task/user, etc.) logged off whenever
unattended, and so on.  Any violation would be grounds for a report
by the guard.  At the same time, the unsafe condition would be
corrected as far as possible by the guard--media collected and
secured, accounts either logged off or reported to system operators
for deactivation, unlocked single-user desktop machines either
locked in the office, if possible, or the power supply secured, etc.
The same desired benefits are obtained: the employee is made amply
aware of his/her faux pas, and security is maintained.  Anyone who's
ever worked in a security environment is aware of these and other
methods; they're actually used, as I mentioned before.

The military does make use of "Tiger Teams" that attempt to penetrate
security and leave proof of their success.  Usually, however, they
are employed in an environment where they're attempting to subvert or
circumvent active security measures, such as the deck guard on a
docked nuke sub, or access to a presumably secured and monitored
area.