
I have been messing around with VGA projects as my latest interest. I have a Xilinx Spartan 3E 250K FPGA, which has just barely too little RAM for a full 640x480 frame buffer. So I'm looking at making things more "interesting" by going for a vector approach rather than a bitmap one. However, it's a bit mind-boggling as far as how to do it. Are there any known, open-source vector-based VGA graphics cards?
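For concreteness, the "just barely too little" arithmetic works out roughly like this (assuming the XC3S250E's 216 Kbit of total block RAM, per the Spartan-3E datasheet):

```python
# Rough framebuffer arithmetic for a Spartan-3E 250K (XC3S250E),
# which has 12 block RAMs of 18 Kbit = 216 Kbit total.
BRAM_BITS = 12 * 18 * 1024  # 221,184 bits

WIDTH, HEIGHT = 640, 480
for bpp in (1, 2, 4, 8):
    needed = WIDTH * HEIGHT * bpp
    verdict = "fits" if needed <= BRAM_BITS else "does not fit"
    print(f"{bpp} bpp: {needed} bits -> {verdict}")
# Even a 1-bit-per-pixel buffer needs 307,200 bits, slightly more
# than the 221,184 bits of BRAM available.
```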

I'm not going to be using this in any kind of "production" scenario, so I don't really care if it can only effectively render at 1 fps or something. I'm just looking for an interesting project idea.

Anindo Ghosh
Earlz
  • If you're not aware of them, http://www.fpgaarcade.com and http://opencores.org/ are both worth a look for ideas. I think you need to sign up for the second one, but they don't send out spam or anything. – PeterJ Apr 25 '13 at 02:55

1 Answer


Please, please, please don't take what I'm about to say personally. I think your question is one that a lot of people have probably wondered about at one point or another. I even up-voted the question. Unfortunately, the answer is "it doesn't work that way". And there is no good way to answer your question without possibly making you feel bad for asking it-- which is a shame because I think that there is useful info in the answer. So, bear with me on this and understand that my motivation is to explain how this stuff works and not to make you feel bad.

Modern displays (TVs, LCDs, plasmas, most CRTs) are raster-based devices, meaning they redraw the screen one scan line at a time. The video interfaces used to talk to these displays are built around a raster model: NTSC, VGA, HDMI, etc. are all raster display technologies.

Years ago, mostly in the '70s and early '80s, there were some true vector displays. Instead of drawing things a scan line at a time, they actually traced out the shape of the objects. The best examples of this are analog o-scopes and the arcade games Asteroids and Battlezone. Very few color vector displays were ever used; the best example is the arcade game Tempest. A different form of vector display is the systems used for laser light shows.

The electrical interface to a vector display has signals for the X and Y location of the beam, the intensity, and sometimes the color. This is very different from the interface for a raster display.

Older CRT technology could be used for either vector or raster displays, but LCDs, plasma screens, OLEDs, etc. are all fundamentally raster-based and cannot easily be used in a vector mode.

The problem with your question is that you are asking about a "vector-based VGA graphics card". "Vector-based" does not go with VGA: VGA is a raster-based display standard and interface, while vector is not. You cannot mix the two; they are different systems.

It is possible to make a vector graphics card based on an FPGA, but you can't connect it to a VGA-type monitor without doing a vector-to-raster conversion in the FPGA -- and that requires as much RAM as, or more than, a standard raster graphics card. You could get a vector display, but those have all but gone away with the demise of the CRT.

The easiest way to get a vector display these days is to get an o-scope that has an X/Y mode. In fact, there are many projects on the web that use an old CRT o-scope as the display for something. Here is a project that uses one to make a clock. And here is another one. There are dozens of other similar projects on the web.
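To illustrate what driving an X/Y display amounts to, here is a rough software sketch (hypothetical, not taken from any of those projects): a list of line segments is turned into a stream of beam positions, which in a real build would feed a two-channel DAC wired to the scope's X and Y inputs.

```python
def trace_segments(segments, step=0.01):
    """Yield (x, y) beam positions that trace each line segment in turn.

    segments: list of ((x0, y0), (x1, y1)) pairs, coordinates in [-1, 1].
    Each yielded pair would become one DAC sample on the X and Y channels.
    """
    for (x0, y0), (x1, y1) in segments:
        # Number of steps scales with the longer axis of the segment,
        # so the beam moves at a roughly constant speed.
        n = max(1, int(max(abs(x1 - x0), abs(y1 - y0)) / step))
        for i in range(n + 1):
            t = i / n
            yield (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

# Trace a square; the beam visits each edge in order and ends where it began.
square = [((-0.5, -0.5), (0.5, -0.5)), ((0.5, -0.5), (0.5, 0.5)),
          ((0.5, 0.5), (-0.5, 0.5)), ((-0.5, 0.5), (-0.5, -0.5))]
points = list(trace_segments(square))
```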

While interesting, these projects are little more than novelties. They are cool novelties, but just novelties. And none of them approach the display quality of a normal VGA card.

An alternative solution is to simply get an FPGA with some external RAM. There are many ways to do this, but the easiest and most pain-free way is to use a Xilinx Spartan-6 based board with external DDR2 SDRAM. There are several FPGA development boards on the market that have both of these chips that will work. Some of them even have VGA interfaces too.

I would not use external SRAM. SRAM, especially async SRAM, is going to be slow to interface to, and the memory size will be limited. It's not impossible to use SRAM, but on a Spartan-6 it is no easier than using DDR2 SDRAM.

I would also not use a Spartan-3. The S3's DDR SDRAM interface is not very good, and it is difficult to get the signal timing to work out reliably. The Spartan-6 has a "hard macro" for the DDR2 interface, which makes the whole thing much easier. The S6 also has more internal RAM for buffers, FIFOs, etc. Xilinx has a nice "Memory Interface Generator" core that makes interfacing several different chunks of logic to the DDR2 SDRAM much easier (with multiple read/write ports, too).
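As a sanity check on why external DDR2 is a comfortable fit, here is a back-of-the-envelope comparison (the 125 MHz, 16-bit DDR2 figures are illustrative assumptions, not a specific board):

```python
# How much memory traffic does scanning a framebuffer out at VGA
# rates generate, versus what a modest DDR2 interface can move?
width, height, refresh_hz = 640, 480, 60

for bpp in (8, 16, 24):
    scanout = width * height * refresh_hz * bpp // 8  # bytes per second
    print(f"{bpp} bpp scan-out: {scanout / 1e6:.1f} MB/s")

# Assume a 16-bit-wide DDR2 part clocked at 125 MHz: two transfers
# per clock (double data rate) of 2 bytes each, at peak.
ddr2_peak = 125e6 * 2 * 2  # bytes per second
print(f"DDR2 peak (assumed part): {ddr2_peak / 1e6:.0f} MB/s")
# Scan-out uses well under a tenth of peak bandwidth, leaving plenty
# of headroom for the drawing logic to write into the buffer.
```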

As for an open-source vector graphics core -- I do not know of one. That doesn't mean there isn't one, just that I have not seen one. But I also doubt that you'll see one that is very good. There just isn't much call for one, since the displays are somewhat rare and limited. If you do find one, it will likely be fairly specialized (e.g., it only displays a clock).

Get an FPGA board with a Spartan-6, DDR2 SDRAM, and a VGA port and you'll be much happier.

  • Wasn't Tempest a black & white display with a translucent decal on the screen to make it look like multicolor? – The Photon Apr 25 '13 at 03:48
  • @ThePhoton No. There were some versions of Battlezone like that, but not Tempest. If you look at pics of Tempest, you'll see that there were multi-color "enemies" moving around on the screen in ways that could not possibly work with a fixed translucent decal. –  Apr 25 '13 at 03:51
  • Few notes: 1. Spartan 3 is just what I have on hand. I won't be upgrading for a while. 2. I don't have any SDRAM so I'm constrained to BRAMs :( 3. My point about a "vector" display is drawing directly as the "beam scans" so to speak and not having a framebuffer. I know it's not a real vector display, but it sounds like an interesting thing to attempt – Earlz Apr 25 '13 at 04:53
  • @Earlz You won't have enough processing speed to handle basically drawing things while the beam is scanning. If you think about it, you have a limited number of clocks per pixel, and a variable number of lines in memory. For every pixel you need to check every vector for an intersection. The max number of vectors will be the number of clocks per pixel. VGA has a pixel clock of about 25 MHz. If your main clock is 125 MHz then you have 5 clocks/pixel. So you can have 5 vectors in your buffer at any given time. That's not very interesting to look at. –  Apr 25 '13 at 12:57
  • @David Kessner is that true with an FPGA though? Presumably you'd build one "circuit" to scan over each pixel on the screen and then one "circuit" to hold each vector and that then each vector would in parallel decide if the currently scanned pixel should be lit up. So the limit is how many vector location comparators you can fit on the fpga, nothing to do with clock cycles. – John Burton Apr 25 '13 at 13:43
  • @JohnBurton Here is the problem: You can't fit enough of these circuits into a Spartan-3 to do a useful number of vectors. If you are lucky, you will get the total number of vectors up to around a hundred. The digits 0-9, on avg, require 5 vectors to draw. So you could put maybe 20 digits on the screen at any one time. That isn't useful. –  Apr 25 '13 at 14:16
  • Ok that makes sense. I would have expected you could have got more than that but I guess they are quite "wide" in terms of number of bits so you are right. – John Burton Apr 25 '13 at 14:18
  • Good points. I didn't expect for it to be capable of any huge feats, but that is rather limited – Earlz Apr 25 '13 at 14:52
  • @DavidKessner: You're thinking in a very brute-force fashion. Not every vector is going to intersect each scan line, so if you keep track of the vector bounding boxes, you should be able to handle "around a hundred" vectors *per scan line*. And there are lots of ways to put a lot of information on a VGA display without a full graphics frame buffer ... with a two-level memory architecture, you could do a basic character generator for (lots of) text, or a sprite-based graphics system. – Dave Tweed Apr 25 '13 at 16:14
  • @DaveTweed There are lots of ways to optimize things, but there will always be limitations. To properly keep track of vector bounding boxes, you need to organize the vector memory in such a way that you know a priori which vectors intersect the current scan line. In the worst case scenario, this memory is actually bigger than the equivalent normal VGA frame buffer. The best way, in my opinion, to minimize BRAM requirements is to use a character generator + Sprites. The Char RAM+ROM requires 32kbits, leaving plenty of room for sprites. –  Apr 25 '13 at 17:14
  • I once looked at using a run-length-encoded frame buffer for a little hobby thing that never went anywhere. For the screens I wanted, I calculated that I could store enough pixels. The problem was that it was VERY hard to update the display, especially in an FPGA system without a CPU (which is what I wanted), so it isn't practical -- but there might be some compression scheme you could use? – John Burton Apr 26 '13 at 09:37
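To make the trade-off discussed in this comment thread concrete, here is a small software model (names and coordinates are illustrative only) of the "draw while the beam scans" approach, including the bounding-box cull suggested above: each scan line only examines the vectors whose vertical extent covers it.

```python
def raster_lines(segments, width, height):
    """Software model of a 'beam race' vector renderer: walk the frame
    one scan line at a time, and for each line consider only the
    segments whose vertical span covers it (the bounding-box cull).
    Coordinates are integer pixel positions inside the frame.
    """
    frame = [[0] * width for _ in range(height)]
    for y in range(height):
        # In hardware this filter would be the per-vector comparators
        # that fire on the current scan line.
        active = [s for s in segments
                  if min(s[0][1], s[1][1]) <= y <= max(s[0][1], s[1][1])]
        for (x0, y0), (x1, y1) in active:
            if y0 == y1:  # horizontal: light every pixel on this line
                for x in range(min(x0, x1), max(x0, x1) + 1):
                    frame[y][x] = 1
            else:         # solve for x where the segment crosses line y
                x = round(x0 + (x1 - x0) * (y - y0) / (y1 - y0))
                frame[y][x] = 1
    return frame

# A small triangle in a 16x16 frame.
tri = [((2, 2), (13, 2)), ((13, 2), (7, 12)), ((7, 12), (2, 2))]
frame = raster_lines(tri, 16, 16)
```

The cull keeps the per-pixel work proportional to the vectors actually crossing the current line rather than the whole display list, which is the point Dave Tweed raises; the cost is that the vector memory must be searchable (or pre-sorted) by Y, which is where the extra storage David Kessner mentions comes from.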