Path: utzoo!attcan!utgpu!jarvis.csri.toronto.edu!rutgers!psuvax1!news
From: flee@shire.cs.psu.edu (Felix Lee)
Newsgroups: news.software.b
Subject: Re: NNTP vs Cnews (was: Re: Cnews is not for me)
Message-ID: 
Date: 13 Aug 89 17:13:10 GMT
References: <2828@ndsuvax.UUCP> <1989Aug12.221624.12153@utstat.uucp>
	<1894@ucsd.EDU> <1895@ucsd.EDU>
Sender: news@psuvax1.cs.psu.edu
Distribution: usa
Organization: Penn State University Computer Science
Lines: 17
In-reply-to: brian@ucsd.EDU's message of 13 Aug 89 06:06:48 GMT

In <1895@ucsd.EDU>,
   brian@ucsd.EDU (Brian Kantor) writes:
> If we were to batch those articles and then process them
> periodically, we would have wasted significant amounts of network
> resources transferring duplicate articles.

We're running C News with NNTP feeds from rutgers, ukma, gatech, and
husc6.  For the last day or so, we've received 36 duplicates out of
1800 articles, about 2%.  This seems acceptably small to me.  Running
news on a fast machine (a Sun 4/280) probably helps.

If NNTP were to batch IHAVEs, then duplicates might become a real
problem.  Hmm.
How about having nntpd lock Message-IDs from the time a sendme is
requested until the batch is processed by rnews?  Like the L
files that B News used to create.
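
One way that might look (just a sketch; the lock directory, file
naming, and function names below are invented for illustration, not
anything in the real nntpd or C News sources) is for nntpd to create a
lock file per Message-ID with O_CREAT|O_EXCL when it issues the
sendme, and for the unlock to happen after rnews has digested the
batch:

/*
 * Hypothetical sketch only -- not from the nntpd or C News sources.
 * Lock a Message-ID by atomically creating a file named after it;
 * unlock it after rnews has processed the batch that contains it.
 */
#include <string.h>
#include <fcntl.h>
#include <unistd.h>

#define LOCKDIR "/usr/spool/news/nntp.locks"	/* assumed location */

/* Build a lock-file path from a Message-ID, mapping '/' to '_' so the
 * ID is safe to use as a file name.  Returns 0, or -1 if it won't fit. */
static int lockpath(const char *msgid, char *buf, size_t len)
{
	size_t j = strlen(LOCKDIR);
	size_t i;

	if (j + 2 > len)
		return -1;
	memcpy(buf, LOCKDIR, j);
	buf[j++] = '/';
	for (i = 0; msgid[i] != '\0'; i++) {
		if (j + 1 >= len)
			return -1;
		buf[j++] = (msgid[i] == '/') ? '_' : msgid[i];
	}
	buf[j] = '\0';
	return 0;
}

/* Called when a sendme is about to be issued.  Returns 1 if we got the
 * lock (go ahead and request the article), 0 if another feed already
 * asked for it or the lock could not be made. */
int lock_msgid(const char *msgid)
{
	char path[1024];
	int fd;

	if (lockpath(msgid, path, sizeof(path)) < 0)
		return 0;
	fd = open(path, O_WRONLY | O_CREAT | O_EXCL, 0644);
	if (fd < 0)
		return 0;	/* already locked, or no permission */
	close(fd);
	return 1;
}

/* Called after rnews has digested the batch containing this article. */
void unlock_msgid(const char *msgid)
{
	char path[1024];

	if (lockpath(msgid, path, sizeof(path)) == 0)
		unlink(path);
}

The O_EXCL creation is the atomic test-and-set: if two feeds offer the
same article while a batch is still pending, only the first sendme
goes out, and the lock goes away once the article is actually in the
history file.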
--
Felix Lee	flee@shire.cs.psu.edu	*!psuvax1!flee