Path: utzoo!utgpu!water!watmath!clyde!att!alberta!teletron!andrew
From: andrew@teletron.UUCP (Andrew Scott)
Newsgroups: news.admin
Subject: Re: Unbiased moderator volunteers
Keywords: no,not the source,luke.
Message-ID: <383@teletron.UUCP>
Date: 25 Jun 88 21:32:20 GMT
References: <2805@rpp386.UUCP> <28.UUL1.3#935@aocgl.UUCP> <4542@gryphon.CTS.COM> <8659@netsys.UUCP>
Organization: TeleTronic Communications Ltd., Edmonton, Alta.
Lines: 44

In article <8659@netsys.UUCP>, len@netsys.UUCP (Len Rose) writes:
> In article <379@teletron.UUCP> andrew@teletron.UUCP (Andrew Scott) writes:
> >I'm also in the "binaries must go" camp.  I think that *all* sources and
> >binaries newsgroups (including comp.sources.unix) should be replaced by
> >a comp.sources.announce type newsgroup, where announcements of new source
> >releases could be made, including instructions on how to get them from an
> >archive server. (perhaps uunet?)
> 
> This is what I have a problem with.. Surely not the SOURCE groups.. Source
> is what makes Usenet so worthwhile for most sites. Surely with the reduced
> traffic,source would not be a problem.. 

The reason I included sources with binaries is that many of them are machine
specific as well.  You'll get no argument from me that sources are probably the
most valuable postings in all of USENET, but they're also large.

For example, many Sun-specific sources are posted to comp.sources.unix, such
as the monster PostScript interpreter from last year.  I'll bet that a good
many sites just let it pass through, since they don't have Suns.  With the
addition of 386 machines and other high-powered PCs to the net, not every
source posting can be used at every site.

Thus, it seems to make sense to archive them and let individual sites pick
them up when they have a use for them.  Surely the overall cost would be lower
than transmitting them through every site that carries comp.sources.unix.

While we're at it, the comp.mail.maps postings are also immense.  Perhaps
future news software could have auto-extraction built in, and post only
updates to the maps in a special form:

	%add site newsite.com
	%change link from_a to_b(DEAD)

and so on.  We wouldn't need to post the whole map for a region when a
site makes a few small changes, nor would we have to post context diffs
(which I believe work out to be as large as posting the whole thing).
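To make the idea concrete, here is a minimal sketch of how news software
might apply such update directives to a map it keeps locally.  The directive
names (%add, %change) and the site-map structure are my own assumptions for
illustration, not any existing map format:

```python
# Hypothetical sketch: apply incremental map directives instead of
# reposting a region's whole map.  The map is modeled as a dictionary
# of site -> {neighbour: link status}.

def apply_directives(site_map, directives):
    """Apply '%add site NAME' and '%change link FROM TO(STATUS)' lines."""
    for line in directives:
        parts = line.split()
        if parts[0] == "%add" and parts[1] == "site":
            # Register a new site with no links yet.
            site_map.setdefault(parts[2], {})
        elif parts[0] == "%change" and parts[1] == "link":
            frm = parts[2]
            # Split "to_b(DEAD)" into neighbour "to_b" and status "DEAD";
            # a bare neighbour with no parenthesized status defaults to "UP".
            to, _, status = parts[3].partition("(")
            site_map.setdefault(frm, {})[to] = status.rstrip(")") or "UP"
    return site_map

m = apply_directives({}, ["%add site newsite.com",
                          "%change link from_a to_b(DEAD)"])
```

A full posting for a region would then shrink to a handful of such lines
whenever only a few sites change.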

It seems to me that a lot of sites are solving the problem of an ever-larger
USENET by buying faster modems and installing larger disks.  Doesn't it make
more sense to re-organize the software than to resort to such brute-force
methods?
-- 
Andrew Scott		andrew@teletron.uucp    - or -
			{codas, ubc-cs, watmath, ..}!alberta!teletron!andrew