Newsgroups: news.software.nntp
Path: utzoo!utgpu!jarvis.csri.toronto.edu!yonge.csri.toronto.edu!neat.ai.toronto.edu!lamy
From: lamy@ai.toronto.edu (Jean-Francois Lamy)
Subject: Re: NFS questions
Message-ID: <88Aug21.131920edt.414@neat.ai.toronto.edu>
Organization: Department of Computer Science, University of Toronto
Date: Sun, 21 Aug 88 14:39:13 EDT

We have something like 80 machines, in different groups, that NFS-mount
a central news server.  On the clients, inews simply validates a few
things and mails the article to the central server (it could use NNTP
instead, yes).  Pnews has been lobotomized so that it does not deal with
moderated newsgroups; the inews on the server handles that.

Our news clients all have this view of the news area (we run Sun 2s, 3s
and 4s).

/local/share/news (aka /usr/lib/news :-) looks as follows:

	active -> /nfs/news-server/news/lib/active	(/usr/lib/news/active on the server)
	inews
	mailpaths -> /nfs/news-server/news/lib/mailpaths	(not used by Pnews/inews)
	newsgroups -> /nfs/news-server/news/lib/newsgroups
	organization

/local/share/rn (often called /usr/local/lib/news/rn) looks like this;
all machines of a group, regardless of architecture, share it:

	INIT
	Pnews.header
	art.help
	filexp
	makedir
	mbox.saver
	mbox.saver.dateit -> /local/lib/rn/mbox.saver.dateit	(machine-dependent code)
	newsnews
	ng.help
	norm.saver
	organization -> /local/share/news/organization
	pager.help
	subs.help

/local/lib/rn (machine-dependent code -- each machine sees the
appropriate one):

	mbox.saver.dateit

rn and the rest of the programs people invoke directly are in /local/bin
(again, each machine sees the appropriate one).

The news server runs a locally grown mailer that can easily be made to
collect all news articles sent in the last n minutes and create a news
batch from them.  That batch is put in a single "lpd" queue, along with
whatever UUCP batches are waiting to be unpacked.  That way, only one
news unbatcher ever runs at any one time.
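The collect-then-queue step can be sketched in shell.  This is only a
toy: the real collection is done by the site's mailer, and the spool and
batch paths below are invented for the demo.  Only the "#! rnews" framing
is the standard batch format the unbatcher expects.

```shell
#!/bin/sh
# Toy sketch of the batch step: gather recently spooled articles into
# one "#! rnews" batch for the single unbatcher queue.  SPOOL and
# BATCH are demo paths, not the real site's directories.
SPOOL=/tmp/news-spool-demo
BATCH=/tmp/news-batch-demo

rm -rf "$SPOOL"; mkdir -p "$SPOOL"
printf 'Subject: one\n\nbody\n' > "$SPOOL/art1"   # stand-in articles
printf 'Subject: two\n\nbody\n' > "$SPOOL/art2"

: > "$BATCH"
for a in "$SPOOL"/*; do
    # Batch format: a "#! rnews <byte count>" line, then the article.
    printf '#! rnews %s\n' "$(wc -c < "$a")" >> "$BATCH"
    cat "$a" >> "$BATCH"
done
```

In the real setup the batch would then be dropped into the lpd queue,
so unpacking is serialized with the waiting UUCP batches.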
We have not yet found a satisfactory way to get NNTP to create, for each
connection, a batch containing all articles not yet received.  The
problem is handling many concurrent NNTP transfers, with many
duplicates, without forking an inews per article just to keep the
history file up to date -- we can't afford forking inews like that.

Jean-Francois Lamy		lamy@ai.utoronto.ca, uunet!ai.utoronto.ca!lamy
AI Group, Department of Computer Science, University of Toronto, Canada M5S 1A4
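As an illustration of the duplicate problem above (a toy sketch, not our
code; the file names and contents are invented for the demo): duplicates
can at least be rejected in bulk by checking offered Message-IDs against
the first field of the history file, which is exactly the work that
forking an inews per article would otherwise repeat.

```shell
#!/bin/sh
# Toy sketch: reject already-seen Message-IDs in bulk against the
# history file, whose first field is the Message-ID.  The demo files
# stand in for the real /usr/lib/news/history and the offered-ID list.
cd /tmp
printf '<a@site>\t880101\tnet.general/1\n<b@site>\t880102\tnet.general/2\n' > history-demo
printf '<a@site>\n<c@site>\n' > offered-demo

# awk reads history-demo first (NR==FNR marks seen IDs), then prints
# only the offered IDs that are not already in the history.
awk 'NR==FNR { seen[$1] = 1; next } !($1 in seen)' history-demo offered-demo
```

Here only <c@site> survives the filter.  This does not solve the hard
part -- keeping the history file consistent across many concurrent
transfers -- but it shows where the per-article cost sits.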