Path: utzoo!utgpu!watmath!att!tut.cis.ohio-state.edu!unmvax!gatech!prism!loligo!pepke
From: pepke@loligo.cc.fsu.edu (Eric Pepke)
Newsgroups: comp.sys.mac
Subject: Re: Help Me! Mac II -> NTSC (repost)
Keywords: video,rgb,II
Message-ID: <257@loligo.cc.fsu.edu>
Date: 17 Aug 89 13:06:43 GMT
References: <475@mit-amt.MEDIA.MIT.EDU> <604@brazos.Rice.edu> <7386@microsoft.UUCP>
Reply-To: pepke@loligo.UUCP (Eric Pepke)
Organization: Supercomputer Computations Research Institute
Lines: 63

In article <7386@microsoft.UUCP> georgeh@microsoft.UUCP (George Hu) writes:
>
>Yes, the MacII -> NTSC is quite real, although it is far from perfect.
>I have built the cable, and successfully gotten it to work.
>I tried it both with a multiscanning monitor that accepts NTSC
>and a regular VCR.  On both, I had the following problems:
>
>1) It is too large for the screen, so the menu bar and other edges are
>cut off.  (This was mentioned in another MacWeek article)
>2) Everything is quite blurry -- regular text is out of the question.
>3) You see double images.
>4) The image is in B&W only.  (It's supposed to be that way)

It's soapbox time for what little I know about video:

1) is due to the overscanning of TVs.  NTSC has 525 lines per frame.
Some are stolen for vertical retrace, and the remainder scan an area
that is a bit larger than the visible area of the tube.  This was
decided upon a long time ago, because the high frequencies involved in
retracing caused bad artifacts near the edge of the screen.

2) is due to a couple of factors.  Your cables may have bad impedance
and cause some ringing, but even if the cables were perfect, there
would still be some problems.  NTSC can only guarantee about 300 pixels
per line for gray scale, and fewer for color, depending on the color.
Remember that NTSC is a massive kludge that had to make signals
compatible with already-assigned bandwidth and with TVs built in the
forties.
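For the curious, that 300-pixel figure can be roughly sanity-checked
from the usual textbook NTSC numbers.  The constants below (4.2 MHz
luminance bandwidth, line timing, and a Kell-style perceptual factor)
are my own assumptions for the sketch, not anything from George's post:

```python
# Back-of-the-envelope check of "about 300 pixels per line" for NTSC
# gray scale, using textbook constants (assumed values, not measured).

LUMA_BW_HZ = 4.2e6        # NTSC luminance bandwidth
LINE_TIME_S = 63.556e-6   # total time per scan line (1 / ~15734 Hz)
BLANKING_S = 10.9e-6      # horizontal blanking interval
KELL_FACTOR = 0.7         # perceptual loss from sampling into scan lines

active_s = LINE_TIME_S - BLANKING_S     # visible portion of each line
nyquist_px = 2 * LUMA_BW_HZ * active_s  # two pixels per cycle of bandwidth
effective_px = KELL_FACTOR * nyquist_px

print(round(nyquist_px))    # ~442 raw samples per active line
print(round(effective_px))  # ~310 usable pixels -- close to the quoted 300
```

Chroma is worse: the I and Q color channels get only about 1.3 MHz and
0.5 MHz of bandwidth, which is why color resolution depends on the color.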
The bandwidth never used to be a problem, because the analog devices
used to produce the signal (tubes) were inherently slow and provided
inherent low-pass filtering.  Now we have transistors and current
feedback loops and such, and the frequencies are much higher.  Studios
now use comb filters to smooth out the signals a bit before broadcast.
Many of the better industrial RGB-to-NTSC converters also have comb
filters.

Another problem, flicker, is due to the fact that NTSC is interlaced:
adjacent lines belong to different fields, so each line is refreshed
only once every 1/30 sec.  If one line is black and the next line is
white, the eye sees flicker.  Again, in normal video this is seldom a
problem, as home-quality vidicon-based cameras can only resolve about
200 lines, and although studios usually use higher-quality image
orthicon cameras, they do filtering.  You can effectively do the
filtering in software, but it takes a lot of work.  This is a good
application for fuzzy fonts.

As an aside, the reason that TVs scan at the same rate as the power
line frequency is not that they derive their scan rate from the line,
as some people believe.  A long time ago, it was impossible to keep the
line frequency entirely from getting through the power supply and
causing problems.  If the frame frequency is close to the line
frequency, one might see bands slowly crawling up or down the screen,
while if it is different, one might see bands quickly flickering across
the screen.  The former is less distracting.

By the way, has anybody tried Apple's hack of running the color signals
through an RGB-to-NTSC filter?

Eric Pepke                                     INTERNET: pepke@gw.scri.fsu.edu
Supercomputer Computations Research Institute  MFENET:   pepke@fsu
Florida State University                       SPAN:     scri::pepke
Tallahassee, FL 32306-4052                     BITNET:   pepke@fsu

Disclaimer: My employers seldom even LISTEN to my opinions.
Meta-disclaimer: Any society that needs disclaimers has too many lawyers.