Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site lanl.ARPA
Path: utzoo!watmath!clyde!bonnie!akgua!whuxlm!harpo!decvax!ucbvax!ucdavis!lll-crg!gymble!umcp-cs!seismo!cmcl2!lanl!crs
From: crs@lanl.ARPA
Newsgroups: net.micro.amiga
Subject: Re: Interlaced monitor
Message-ID: <30637@lanl.ARPA>
Date: Fri, 13-Sep-85 12:20:00 EDT
Article-I.D.: lanl.30637
Posted: Fri Sep 13 12:20:00 1985
Date-Received: Mon, 16-Sep-85 00:02:36 EDT
References: <6789@ucla-cs.ARPA> <204@cirl.UUCP>
Organization: Los Alamos National Laboratory
Lines: 39

> 	In theory, all monitors should be capable of interlace. During
> interlace mode, only half the scan lines are drawn for a particular
> sweep. The other scan lines are drawn during the next sweep. In the
> AMIGA, 400 line resolution can be achieved this way. The price one pays
> is flicker, since screen sweeps are only done 30 times a second, as
> opposed to 60 times a second during non-interlace. Thus a higher
> persistance monitor is needed to eliminate flicker.

Is interlace *that* different in a video monitor than it is in a TV?

The reason that interlace is used in TV is to *reduce* flicker.
Because of limited bandwidth, a full TV image can only be produced
once every 33.33 milliseconds (i.e., 1/30 of a second).  At this
scanning rate, a given area of the screen is illuminated only 30 times
per second and there would be a noticeable top-to-bottom moving flicker.
By using interlace scanning, *every other* scanning line is scanned
from top to bottom in 1/60 of a second, then the "missing" lines are
filled in during the second "field" of the frame.  Thus, the entire
screen is illuminated from top to bottom 60 times per second rather
than 30 and *apparent* flicker is reduced because no *large area* is
left unexcited for more than 16.67 ms, which is accommodated by
persistence of vision.  While it still takes 1/30 of a second to
produce a full picture or "frame," the entire screen is scanned 60
times per second by breaking the frame up into two interlaced fields.
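To make the field timing concrete, here's a small sketch (my own
illustration, not from the original posting) of how two interlaced
fields cover one frame.  The line count and rates are assumed
NTSC-like simplifications: 480 visible lines, 60 fields per second.

```python
# Sketch: how a two-field interlaced scan covers a frame.
# Assumed, simplified numbers: 480 visible lines, 60 fields/s.

VISIBLE_LINES = 480               # assumed visible line count
FIELD_RATE_HZ = 60.0              # fields per second
FRAME_RATE_HZ = FIELD_RATE_HZ / 2 # two fields per frame -> 30 frames/s

def field_lines(field):
    """Lines drawn in one field: even lines one pass, odd lines the next."""
    start = 0 if field % 2 == 0 else 1
    return list(range(start, VISIBLE_LINES, 2))

even = field_lines(0)
odd = field_lines(1)

# Each field spans the full screen height with half the lines...
assert len(even) == len(odd) == VISIBLE_LINES // 2
# ...and together the two fields cover every line exactly once.
assert sorted(even + odd) == list(range(VISIBLE_LINES))

# Any large region of the screen is refreshed every field period
# (1/60 s, about 16.67 ms), even though a complete frame takes
# a full frame period (1/30 s, about 33.33 ms).
print(1000.0 / FIELD_RATE_HZ)   # field period in ms
print(1000.0 / FRAME_RATE_HZ)   # frame period in ms
```

The point of the sketch is the two assertions: each field alone spans
the whole screen top to bottom, which is why no large area goes dark
for more than one field period.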

If system bandwidth is adequate, the entire image or frame could be
scanned 60 times per second *noninterlaced*, and the synchronizing
system would be considerably simplified (especially the sync generator
at the source), but it is questionable (to me, at least) whether
(apparent) flicker would be less.
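The bandwidth cost of that noninterlaced alternative is easy to put a
number on.  A back-of-the-envelope comparison (my own arithmetic,
using the NTSC total of 525 lines per frame as an assumed figure):

```python
# Back-of-the-envelope line-rate comparison (my own illustration;
# assumed NTSC-like figure of 525 total lines per frame).

TOTAL_LINES = 525

# Interlaced: 30 complete frames per second (as two fields each).
interlaced_lines_per_sec = TOTAL_LINES * 30

# Noninterlaced at the same flicker performance: 60 complete frames/s.
progressive_lines_per_sec = TOTAL_LINES * 60

print(interlaced_lines_per_sec)   # 15750 lines/s
print(progressive_lines_per_sec)  # 31500 lines/s -- roughly double
                                  # the line rate, hence roughly double
                                  # the video bandwidth
```

So the "adequate bandwidth" in question is about twice that of the
interlaced system, which is exactly what broadcast TV was designed to
avoid.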

What is different in the case of a video display monitor?

-- 
All opinions are mine alone...

Charlie Sorsby
...!{cmcl2,ihnp4,...}!lanl!crs
crs@lanl.arpa