I have a lot of old analog video equipment, and my trusty Apple IIe Color Monitor is finally dying. Since CRTs aren't produced anymore, and because I'm just interested in how it's done, I obtained a decent plasma monitor (I can't get enough of those deep black tones). I want to know if an analog signal can be deinterlaced without resorting to an FPGA or similar overkill.
Can something that removes the interlacing effect on non-CRT displays be built from components like 74-series ICs and proc amps (and maybe an AVR microcontroller), or do I need to go out and get a specialized chip from a specialized manufacturer? All I want for output is progressive-scan composite video (a better signal format would be acceptable, but I would prefer composite).
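To make it clear what I mean by "deinterlacing", here is roughly the digital operation I have in mind, written as a C sketch. The resolution, the 8-bit sampling, and the plain "weave" of two fields are all just my assumptions to show the scale of the buffering involved, not a proposed design:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define WIDTH        640            /* assumed samples per line            */
#define LINES        480            /* assumed visible lines per frame     */
#define FIELD_LINES  (LINES / 2)    /* each interlaced field carries half  */

/* Interleave an even field (lines 0,2,4,...) and an odd field (1,3,5,...)
 * into one progressive frame -- the simplest "weave" deinterlace. */
static void weave(const uint8_t even_field[FIELD_LINES][WIDTH],
                  const uint8_t odd_field[FIELD_LINES][WIDTH],
                  uint8_t frame[LINES][WIDTH])
{
    for (int l = 0; l < FIELD_LINES; l++) {
        memcpy(frame[2 * l],     even_field[l], WIDTH);  /* even field line */
        memcpy(frame[2 * l + 1], odd_field[l],  WIDTH);  /* odd field line  */
    }
}

/* Static buffers: roughly 600 KB total at these assumed sizes. */
static uint8_t even_buf[FIELD_LINES][WIDTH];
static uint8_t odd_buf[FIELD_LINES][WIDTH];
static uint8_t frame_buf[LINES][WIDTH];

int main(void)
{
    weave(even_buf, odd_buf, frame_buf);
    printf("wove %d+%d field lines into a %d-line frame\n",
           FIELD_LINES, FIELD_LINES, LINES);
    return 0;
}
```

Even at that modest resolution the buffers run to a few hundred kilobytes, far beyond the few KB of SRAM in any AVR, which is part of why I'm unsure this can be done with hobbyist parts.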
The most complex digital video generation hardware I have ever worked with was that of a Commodore 64 (I converted it to PAL by swapping ICs from a European unit and replacing a few caps). My only other experience was adding an analog filter to an NES system. So digital isn't really my thing. I hate to think this is a dumb question, and hopefully, even if there's no answer, I am not the only one who wants this.