If I remember correctly, the VGA sync outputs switch polarity to tell older monitors which vertical mode is in use (the combination of horizontal and vertical sync polarities encodes 350, 400, or 480 lines), so each sync wire will be high most of the time in some modes and low most of the time in others. When using a newer monitor with a vintage VGA card, it would probably be possible to flip the sync polarity as a means of controlling an LED; newer monitors probably wouldn't care about the polarity, though switching it might cause a momentary display glitch.
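For what it's worth, on a real VGA card the sync polarities live in the Miscellaneous Output Register (written at port 0x3C2, read back at 0x3CC), so flipping the LED from software could be little more than a port write. A minimal sketch, assuming a DOS-style environment with direct port I/O (the inp()/outp() helpers from <conio.h>; Borland compilers also offer inportb()/outportb()):

```c
/* Sketch: using VGA vertical-sync polarity as a crude 1-bit output.
 * Assumes direct port access to the standard VGA Miscellaneous Output
 * Register: write port 0x3C2, read-back port 0x3CC.
 */
#include <conio.h>

#define MISC_OUT_WRITE 0x3C2
#define MISC_OUT_READ  0x3CC
#define VSYNC_POLARITY 0x80   /* bit 7: VSYNC polarity (set = negative, if memory serves) */

void set_led(int on)
{
    unsigned char misc = inp(MISC_OUT_READ);      /* current settings        */
    if (on)
        misc |= VSYNC_POLARITY;                   /* flip VSYNC polarity     */
    else
        misc &= (unsigned char)~VSYNC_POLARITY;   /* restore original sense  */
    outp(MISC_OUT_WRITE, misc);                   /* register is write-only
                                                     at 0x3C2, hence 0x3CC   */
}
```

The LED driver itself would just need to compare the VSYNC line against a reference so it responds to the average level rather than to the individual sync pulses.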
Alternatively, one could probably construct a circuit using a 556 or similar dual timer, a fast analog comparator, and a 74HC74 or similar dual flip-flop, to latch the state of a particular region of the screen. One of the timers should be set for a delay of around 2-12 ms triggered by the vertical sync, and the other for a delay of 20 µs or so triggered by the horizontal sync. When the second timer times out, it should clock the first latch to capture the state of the first timer; the output of that latch should then clock the second latch, which captures the output of the comparator monitoring the voltage on the R, G, or B pin of interest.
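To put rough numbers on those delays, here is a quick back-of-the-envelope sketch assuming the standard 640x480 @ 60 Hz VGA timing (25.175 MHz pixel clock, 800 clocks per line, 525 lines per frame); the target_col/target_row values are just placeholders for whichever "pixel" you want to watch, and real monostables would need trimming anyway:

```c
/* Sketch: starting-point timer delays for sampling a given pixel,
 * assuming standard 640x480 @ 60 Hz VGA timing.
 */
#include <stdio.h>

#define PIXEL_CLOCK_HZ 25175000.0
#define H_TOTAL        800      /* pixel clocks per scan line        */
#define H_SYNC         96       /* sync pulse width, in pixel clocks */
#define H_BACK_PORCH   48
#define V_SYNC         2        /* sync pulse width, in scan lines   */
#define V_BACK_PORCH   33

int main(void)
{
    int target_col = 320, target_row = 240;   /* pixel to sample (center) */

    double line_period_s = H_TOTAL / PIXEL_CLOCK_HZ;          /* ~31.78 us */

    /* Delay from the leading edge of VSYNC to the start of the target row. */
    double v_delay_s = (V_SYNC + V_BACK_PORCH + target_row) * line_period_s;

    /* Delay from the leading edge of HSYNC to the target column. */
    double h_delay_s = (H_SYNC + H_BACK_PORCH + target_col) / PIXEL_CLOCK_HZ;

    printf("vertical timer:   %.2f ms after VSYNC\n", v_delay_s * 1e3);
    printf("horizontal timer: %.2f us after HSYNC\n", h_delay_s * 1e6);
    return 0;
}
```

For the center of the screen this works out to roughly 8.7 ms and 18 µs, which is why the 2-12 ms and 20 µs figures above cover the visible area.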
The net effect of the circuit would be to have the state of the LED indicate whether the brightness of a particular "pixel" was higher or lower than a certain threshold. By adjusting the timers, one could control which pixel was sampled. For reliability, one would probably have to display a rather large rectangle rather than a single pixel, but this approach could easily be extended to work with dozens or hundreds of LEDs.
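On the software side, painting that indicator block could look something like the sketch below; the framebuffer layout (one byte per pixel, `stride` bytes per row) is an assumption, so it would need adapting to whatever video mode is actually in use. Tiling several such blocks at known coordinates is how the scheme would scale to many LEDs, one timer pair (or one adjustable delay) per block.

```c
/* Sketch: paint a solid block around the sampled coordinate so the
 * comparator sees a stable level for many scan lines in a row.
 * fb is assumed to be a byte-per-pixel framebuffer, `stride` bytes wide.
 */
#include <stdint.h>
#include <string.h>

void set_indicator(uint8_t *fb, int stride, int height,
                   int cx, int cy, int half, int on)
{
    uint8_t value = on ? 0xFF : 0x00;   /* full brightness vs. black     */
    for (int y = cy - half; y <= cy + half; y++) {
        if (y < 0 || y >= height)
            continue;                   /* clip to the visible area      */
        int x0 = cx - half < 0 ? 0 : cx - half;
        int x1 = cx + half >= stride ? stride - 1 : cx + half;
        memset(fb + (size_t)y * stride + x0, value, (size_t)(x1 - x0 + 1));
    }
}
```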