8

In a discussion, a friend mentioned:

In the original implementations of PAL and NTSC, they used the AC current as a means of providing the frequency for the TV. As different mains supplies had different frequencies, they designed the TV standards to have different frequencies.

I wasn't sure about this, so I wanted to check.

My question is: Is the design decision for different frequencies in PAL and NTSC related to the AC mains power frequency?

hawkeye

3 Answers

10

Yes, it is related.

In early TV implementations, it was not easy to remove all of the AC line ripple from the DC power circuits that drove the CRT, and this resulted in a slight variation in intensity from top to bottom. It was found that if the vertical frequency of the TV signal was the same as the power line frequency, these intensity variations would appear in the same location on every vertical sweep, effectively causing them to "stand still" on the screen, and this was much less objectionable than having them drift up or down.

There are also sources of RF noise that are related to the power line frequency, and the visual artifacts caused by that kind of noise also stand still on the screen.
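
To put a number on "standing still": a hum bar crawls up or down the picture at the beat between the vertical (field) rate and the mains frequency, so locking the two makes the drift rate zero. A minimal sketch (not part of the original answer; the frequencies are chosen purely for illustration):

```python
# Rough sketch: a hum bar rolls through the picture at the beat frequency
# between the vertical (field) rate and the mains frequency. If the two are
# equal, the beat is zero and the bar stands still.

def hum_bar_roll_period(field_rate_hz: float, mains_hz: float) -> float:
    """Seconds for a hum bar to roll once through the picture height
    (float('inf') means it is stationary)."""
    beat_hz = abs(field_rate_hz - mains_hz)
    return float("inf") if beat_hz == 0 else 1.0 / beat_hz

print(hum_bar_roll_period(60.0, 60.0))          # inf   -> bar stands still
print(hum_bar_roll_period(60.0 / 1.001, 60.0))  # ~16.7 -> NTSC colour field rate vs 60 Hz mains
print(hum_bar_roll_period(50.0, 49.9))          # ~10.0 -> crystal-locked 50 Hz vs mains 0.1 Hz low
```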

Dave Tweed
  • Also, line sync freezes the picture distortion caused by magnetic fields from power conductors near the TV. – tomnexus Apr 11 '15 at 13:47
  • The "philosophy" now is to synchronize frames and lines to the mains, through PLL circuitry, for EU TVs. –  Jun 20 '21 at 13:22
3

PAL and NTSC are colour encoding systems and are not necessarily related to horizontal and vertical scan frequencies.

The choice to make the vertical scan frequency the same as the local power line frequency was made so that picture disturbances due to poor power supply filtering and magnetic fields from power currents would be less obvious. With the power line frequency and the vertical scan frequency the same, any such disturbance is stationary on the screen, and so is less noticeable than a disturbance rolling through the picture, as would happen if the frequencies were different.

Peter Bennett
  • Ever since the NTSC color broadcasting standard was adopted, the frame rate is no longer 30 frames (1/2 the power-line frequency) per second but rather 30/1.001 (approximately 29.97) frames per second, to reduce interference seen on B&W TVs between the color signal and the FM carrier for audio (the arithmetic is sketched after this comment thread). – tcrosley Apr 12 '15 at 05:09
  • @tcrosley I always assumed the frequency was actually locked to the local line (for example, security cameras do this). Line frequency varies, but there are benefits to being locked during recording, and again being locked during playback on air. I suppose if it's a quartz clock, the beat would be very slow, tens of seconds, so might not be visible anyway. Do you know if it uses Line, or is internally generated now? – tomnexus Apr 12 '15 at 08:22
  • No, they don't use line frequency, because of the difference between 1/29.97 and 1/30 sec per frame. See this article about [SMPTE drop frame timecode](http://en.wikipedia.org/wiki/SMPTE_timecode#Drop_frame_timecode) for more info. It is assumed the clocks for recording and playback are *exactly* the same frequency. In TV studios, timecode is generated by a master sync generator, tied to an atomic clock standard. Portable cameras may use time code generators using temperature-controlled crystals. A new development is to make use of GPS receivers since GPS signals are accurate to ±10 ns. – tcrosley Apr 12 '15 at 09:20
  • *"I always assumed the frequency was actually locked to the local line (for example, security cameras do this)."* I imagine security cameras just ignore the whole drop-frame issue since it was to ensure compatibility between B&W and color broadcasts. I assume security systems are either B&W *or* color and in any case, can run at 30 fps. – tcrosley Apr 12 '15 at 09:34
  • The first TV station I worked at had a master sync generator that did have provision to lock to the AC power line, but that was for monochrome use. Colour sync generators for NTSC were driven by a 14.31818 MHz crystal (four times the colour subcarrier frequency). I wouldn't be surprised if current sync generators are controlled by a GPS-locked frequency standard. I'd expect that colour security/closed circuit cameras would have 14.31818 MHz crystal oscillators (but that's just a guess). – Peter Bennett Apr 12 '15 at 16:32
  • I'd doubt it'd actually lock to the line frequency, other than as a reference for the timing circuitry as a whole. Mains AC frequency isn't absolute: it doesn't maintain a rock-solid 50 or 60 hertz, but varies over the course of a day in response to grid load, available generator power, etc., and power companies simply try to maintain it within about 1% of nominal instantaneously (if it drifts by more than this, the grid's in trouble, and some UPSes will even trigger protective shutdown in response) and to average an exact 50/60 over 24 hours to keep mains-powered clocks in sync day-to-day. – tahrey May 07 '19 at 21:47
  • Whilst that would still mean an integrated national grid could keep broadcast studio and receiver in sync, there are still plenty of reasons to avoid it. Grids are not necessarily entirely nationally synchronised, for one thing, and could become segmented (much more so during the early days of TV than at present). The radio propagation delay is long enough compared to the time taken to scan a line to cause ghosting of the image from multipathing and reflections, so it could certainly be enough to desync the picture vs the power line, especially as cable propagation is slower than over-the-air. – tahrey May 07 '19 at 21:55
  • And whilst it may have been fine for live broadcast, it'd be no good for telecine of newsreels, and later for videotape recordings, where you need to lock the broadcast to whatever rate is coming off the prerecorded media, i.e. how fast the telecine machine or tape player runs (which you'd build to run as close to 50/60 Hz as possible, but probably couldn't guarantee, especially when converting 24 fps film in 60 Hz areas). – tahrey May 07 '19 at 21:58
  • All these and probably more are why, even if the studio equipment may have been synchronised to its own local AC supply (one particularly strong argument for the sync, rather than simplifying camera/receiver circuitry, is so the cameras would be synchronised to the high powered electric, sometimes arc-based studio lights and prevent strobing), the actual timing at the receiver end is based off sync pulses embedded in the video signal itself that signify not only the end of each frame but each line, as well as what interlace field is next, *resetting* the timing nearly 16000 times per second. – tahrey May 07 '19 at 22:01
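
For reference, here is a sketch of the arithmetic behind the 30/1.001 figure discussed in the comments above. The relationships are the standard NTSC colour numbers; the script itself is only illustrative:

```python
# NTSC colour ties its rates to the 4.5 MHz sound intercarrier: the line rate
# was set to 4.5 MHz / 286 so that the colour subcarrier (455/2 times the line
# rate) interleaves with both the luminance spectrum and the sound carrier.

sound_intercarrier_hz = 4_500_000.0

line_rate_hz = sound_intercarrier_hz / 286      # ~15,734.27 Hz (B&W used 15,750 Hz)
frame_rate_hz = line_rate_hz / 525              # ~29.970 Hz, i.e. 30/1.001
field_rate_hz = line_rate_hz / 262.5            # ~59.94 Hz
subcarrier_hz = 455 / 2 * line_rate_hz          # ~3.579545 MHz
crystal_hz = 4 * subcarrier_hz                  # ~14.31818 MHz, the crystal mentioned above

print(f"line rate         {line_rate_hz:,.3f} Hz")
print(f"frame rate        {frame_rate_hz:.5f} Hz (30/1.001 = {30 / 1.001:.5f})")
print(f"field rate        {field_rate_hz:.4f} Hz")
print(f"colour subcarrier {subcarrier_hz:,.2f} Hz")
print(f"4x subcarrier     {crystal_hz:,.0f} Hz")
```
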
2

Dave Tweed's answer is largely correct. But it wasn't just AC ripple on the DC power circuits that caused the variation. The signal circuits in early TVs used tubes (a.k.a. valves). The cathodes usually had a heater filament that was often driven by low-voltage AC (typically about 6 V). This caused the temperature of the cathode, and consequently the gain of the tube, to vary somewhat at twice the power line frequency (the heater power varies with the square of the AC voltage, hence the doubled frequency).
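
As a quick check of the "doubled frequency" point, here is a small sketch (not part of the answer; the 6.3 V RMS heater voltage, 60 Hz mains, and 1-ohm resistance are assumptions for illustration) showing that the power dissipated in a resistive heater ripples at twice the mains frequency:

```python
import numpy as np

# v(t) = Vpk*sin(2*pi*f*t), so v(t)^2 = (Vpk^2/2) * (1 - cos(2*pi*(2f)*t)):
# the heater power has a DC term plus a component at twice the mains frequency.

f_mains = 60.0                                            # Hz (assumed)
t = np.linspace(0.0, 0.1, 10_000, endpoint=False)         # 0.1 s of samples
v = 6.3 * np.sqrt(2) * np.sin(2 * np.pi * f_mains * t)    # 6.3 V RMS heater winding (assumed)
p = v ** 2                                                # power into a nominal 1-ohm heater
                                                          # (the resistance doesn't affect the frequency)

spectrum = np.abs(np.fft.rfft(p - p.mean()))              # drop the DC term, keep the ripple
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
print(f"dominant ripple at {freqs[spectrum.argmax()]:.0f} Hz")  # 120 Hz = 2 x mains
```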

  • But if that implies a 100 or 120Hz ripple, wouldn't that mean, e.g. a dark band across the middle of each 50 / 60Hz field, and two bright ones near the 1/4 and 3/4 levels (or vice versa... etc)? Still less objectionable than bands rolling / flickering any more rapidly than maybe once or twice per minute, but still very obvious and annoying. Seems like something that'd have to be fixed during the early development and engineering refinement of the receiver (and camera) hardware itself e.g. with an inverted and attenuated copy of the heater drive current to modulate the tube output... – tahrey May 07 '19 at 22:06
  • Like I doubt the idea of feedback driven voltage regulation would have been foreign to engineers even in the early days of electronics. The modern LM semiconductor regulator may miniaturise the necessary circuitry and make its setup easy, but the idea behind it is pretty well-worn. And, besides that, they already had capacitors... ((I mean... I'm not saying it wasn't the case as it sounds like you speak from experience... it just sounds rather unnecessary and avoidable even with tech of the time)) – tahrey May 07 '19 at 22:10
  • The effect of the ripple on the heater filaments was effectively put through a low-pass filter by the thermal inertia of the heater coil, so it wasn't too bad. – Stephen C. Steel May 07 '19 at 22:14