Path: utzoo!utgpu!water!watmath!clyde!att!osu-cis!tut.cis.ohio-state.edu!rutgers!iuvax!bobmon
From: bobmon@iuvax.cs.indiana.edu (RAMontante)
Newsgroups: comp.binaries.ibm.pc.d
Subject: Re: A Dumb Idea
Message-ID: <11633@iuvax.cs.indiana.edu>
Date: 14 Aug 88 16:43:41 GMT
References: <17362@gatech.edu> <581@rtg.cme-durer.ARPA>
Reply-To: bobmon@iuvax.UUCP (RAMontante)
Organization: malkaryotic
Lines: 18

Since "this machine's" ULTRIX allocates disk blocks 1K at a time, I
average about 500 bytes of internal frag. loss per file, regardless of
compression.  If I compress a file less than 1K, there is no change
(compress knows it made a smaller file, but doesn't know that the same
block is allocated).  With many smallish files this can be a problem;
the recent Omega game posting is an example.
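
To put numbers on that block rounding, here's a quick Python sketch of the
arithmetic (illustrative only; the 1K block size matches ULTRIX's allocation
unit, and the file sizes are made up):

BLOCK = 1024

def allocated(size):
    # Disk space actually consumed: size rounded up to a whole 1K block.
    return ((size + BLOCK - 1) // BLOCK) * BLOCK

sizes = [300, 700, 1500, 2049]          # hypothetical smallish files
waste = sum(allocated(s) - s for s in sizes)
print(waste, "bytes lost to internal fragmentation over", len(sizes), "files")

# Shrinking 700 bytes to 400 bytes still occupies one 1K block, so no gain.
print(allocated(700) == allocated(400))  # True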

So here's my sick solution:  I compressed all the files.  Then I used the
UNIX-based ARC to jam them all into an archive.  ARC didn't do any
compression, since compress had already done a superior job, but it did
flush all that internal fragmentation.  It also supplied file-by-file CRCs,
allowed individual extraction or updating, and kept a directory of the
original dates.  BTW, running ARC on the original files produced a noticeably
larger archive; compress really did do a much better job, even after ARC
collected everything together and added its own internal info.
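
For the record, the whole trick fits in a few lines.  Here's a rough Python
sketch of it (my own, not what I actually typed; it assumes "compress" and a
UNIX arc port that takes the usual "a" (add) command are on the path, and the
glob pattern and archive name are just placeholders):

import glob
import subprocess

files = glob.glob("omega.*")        # the smallish files (placeholder pattern)
for f in files:
    # -f: replace the file even if compression doesn't shrink it
    subprocess.run(["compress", "-f", f])

compressed = [f + ".Z" for f in files]
# Pack the already-compressed files into one archive.  ARC can't squeeze the
# .Z files any further, but the per-file block rounding is now paid only once.
subprocess.run(["arc", "a", "omega.arc"] + compressed)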
-- 
--    bob,mon			(bobmon@iuvax.cs.indiana.edu)
--    "Aristotle was not Belgian..."	- Wanda