
Low resolution...

Started by June 22, 2010 06:10 PM
1 comment, last by Ravyne 14 years, 4 months ago
I have a question about graphics cards with TV-out. In general, can they display super low resolutions like 256x224? Or is there a set list of resolutions they support? For example, if I were to run emulators (NOT with pirated games) on an ATI Rage Mobility at the emulated systems' native resolutions, would the picture appear correctly on the TV, or would it scale up to 640x480 like my current card does with my particular monitor? My reason for asking is that I hope to build a tiny ARM-based computer with an ATI Rage Mobility or something similar to emulate old consoles more closely than a typical modern PC does. Any help is appreciated.
You might save yourself some time and headache with one of these; but apart from that, the card manufacturer would be able to tell you the supported resolutions. I doubt you'll find anything that can do 256x224; most cards with S-Video out are restricted to the 640x480 to 1024x768 range.
Televisions have a fixed vertical resolution of 480 scanlines, but the signal is interlaced -- meaning that on one pass the odd scanlines are drawn, and on the next the even scanlines. These fields arrive at a rate of 60 Hz, yielding 30 odd and 30 even fields per second. Each field also carries some overscan along with the sync signal, so in practice you generally get about 224 usable scanlines per field.
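As a back-of-envelope check, the arithmetic above works out like this (using the nominal round figures; the exact NTSC standard is 59.94 fields/s and 525 total scanlines):

```python
# Nominal NTSC interlacing arithmetic (round figures assumed; exact
# NTSC timing is 59.94 fields/s across 525 total scanlines).
VISIBLE_LINES = 480   # visible scanlines in one full interlaced frame
FIELD_RATE_HZ = 60    # fields (odd or even half-frames) per second

lines_per_field = VISIBLE_LINES // 2   # each field carries half the lines
frame_rate_hz = FIELD_RATE_HZ // 2     # odd field + even field = one frame

print(lines_per_field, frame_rate_hz)  # 240 lines/field, 30 frames/s
```

Overscan and sync then eat into those 240 lines per field, which is where the roughly 224 usable scanlines come from.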

Horizontally, you can basically feed color data to the television as fast as you're able, for as long as it can keep up. 512 pixels per line was done fairly commonly; it's possible you might get even more, I don't know.

As for graphics cards and TV output, the card generally renders to an internal resolution that has nothing to do with the television's native resolution (though in practice, most cards limit the internal resolution to 640x480 or 800x600 when TV-out is enabled -- any higher and you'd lose too much detail when it's scaled down for TV output). A separate scaling stage then brings that internal resolution down to television resolution and feeds it to the television encoder. Televisions work very differently from a PC monitor in terms of signal: chroma/luma encoding (S-Video), composite (chroma/luma over one wire), or (gasp!) RF modulation (chroma/luma/sound over one wire), whereas an analog PC signal (VGA-style) has one wire for each color component: red, green, and blue.


As for hooking up a PC video card to an ARM chip, it's certainly possible and an interesting project in its own right, but as a means to an end it's far more difficult than it's worth. Not only would you have to figure out how to wire it up electrically and whether the ARM chip can keep up, you'd also end up writing a driver for the thing from scratch and, depending on how much public knowledge of that particular GPU exists, likely reverse-engineering it in whole or in part.


There are a couple options if this is about getting native arcade resolutions out of an emulator:

Easiest: Select a sufficiently high-resolution monitor and turn on one of the scaling algorithms in your emulator's settings (Scale2x/3x/4x) -- it won't look like the real thing, but it does look pretty good. Some emulators even have filters which simulate the wide inter-pixel gaps that most people associate with the classic arcade "look".
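For reference, the Scale2x rule itself is tiny. A minimal sketch in Python, operating on a 2D grid of pixel values (real emulators do the same thing per-frame in optimized native code):

```python
def scale2x(img):
    """Scale2x (AdvMAME2x): double a 2D grid of pixel values, rounding
    off stair-step edges without blurring. Edge pixels reuse themselves
    as their missing neighbors."""
    h, w = len(img), len(img[0])
    out = [[None] * (w * 2) for _ in range(h * 2)]
    for y in range(h):
        for x in range(w):
            E = img[y][x]                              # center pixel
            B = img[y - 1][x] if y > 0 else E          # above
            H = img[y + 1][x] if y < h - 1 else E      # below
            D = img[y][x - 1] if x > 0 else E          # left
            F = img[y][x + 1] if x < w - 1 else E      # right
            if B != H and D != F:
                # On a detected edge, pull in the matching neighbor color.
                out[2*y][2*x]         = D if D == B else E
                out[2*y][2*x + 1]     = F if B == F else E
                out[2*y + 1][2*x]     = D if D == H else E
                out[2*y + 1][2*x + 1] = F if H == F else E
            else:
                # Flat area: plain pixel doubling.
                out[2*y][2*x] = out[2*y][2*x + 1] = E
                out[2*y + 1][2*x] = out[2*y + 1][2*x + 1] = E
    return out

# A lone corner pixel gets its diagonal smoothed instead of staying blocky:
print(scale2x([[1, 0],
               [0, 0]]))
```

Flat regions come out as plain 2x pixel doubling; only diagonal edges get the interpolation, which is why the result stays sharp rather than blurry.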

Easy: There was a supplier of video cards with custom BIOS modifications that would output native arcade resolutions. I believe the earlier model was a PCI-based Radeon 9250, and the newer one is a PCIe-based Radeon X700 or so. Google ArcadeVGA.

Harder: Under Linux you can generally define custom video modes (or at least attempt to), and this could be done in such a way as to target actual arcade monitors -- however, doing so carries at least some likelihood of damaging your monitor, and possibly the video card itself.
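To make that concrete, a custom mode in xorg.conf syntax might look like the sketch below. These timing numbers are illustrative values in the 15 kHz arcade-monitor range, not tested settings -- check them against your monitor's rated horizontal and vertical sync ranges before trying anything like this, for exactly the damage reasons above:

```
# Illustrative 256x224 modeline -- verify against your monitor's specs first.
# 6.00 MHz pixel clock / 384 total columns = 15.625 kHz horizontal;
# 15625 Hz / 261 total lines = ~59.9 Hz vertical refresh.
Modeline "256x224" 6.00  256 272 296 384  224 229 232 261  -hsync -vsync
```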

Hardest: Rather than wiring up a video card to an ARM chip, consider getting a simple NTSC television encoder chip (about $10; Analog Devices makes a bunch) and driving it from an R-2R ladder which is in turn driven directly from the ARM's I/O pins (I presume we're talking about a microcontroller here). Since you're generating the raster signal in real time, you tie up the CPU for about 92% of its time, but depending on the frequency of the CPU you should get 14-30 cycles per pixel -- generally plenty of time to do clever video stuff (like grabbing a bunch of pixels from different planes and compositing them with priority), but not enough to bother context-switching back and forth to do other work. You can preprocess the next scanline and spit out audio during h-blank to give yourself something useful to do. Now, being left with 8% of the processor doesn't sound too appealing, but you'd be amazed how much power that actually is once rendering and audio are taken out of the equation -- Google for Uzebox to see a system that works like that and achieves SNES-like graphics from an 8-bit, 28 MHz microcontroller with 4K of RAM. If you're careful in your chip selection, a DMA controller with a programmable delay would let you render an entire scanline at a time and use DMA to send it to the encoder, essentially giving you a one-line double-buffering mechanism (which is exactly how the Atari 2600 worked behind the scenes).
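A quick back-of-envelope for the cycles-per-pixel figure. The NTSC active-line duration is standard; the 256-pixel line and the CPU clock speeds are assumed examples, picked to show how the 14-30 range falls out of typical ARM microcontroller frequencies:

```python
# Cycles available per pixel when bit-banging NTSC video from a CPU.
# NTSC line timing is standard; resolution and clocks are example values.
ACTIVE_LINE_US = 52.6    # visible portion of one NTSC scanline, microseconds
PIXELS_PER_LINE = 256    # assumed horizontal resolution

def cycles_per_pixel(cpu_mhz):
    """CPU cycles available for each pixel during active video."""
    pixel_us = ACTIVE_LINE_US / PIXELS_PER_LINE   # time budget per pixel
    return pixel_us * cpu_mhz                     # us * cycles/us = cycles

print(round(cycles_per_pixel(72), 1))    # ~14.8 cycles at a 72 MHz part
print(round(cycles_per_pixel(150), 1))   # ~30.8 cycles at 150 MHz
```

So a microcontroller somewhere in the 70-150 MHz range lands right in the 14-30 cycles-per-pixel window described above.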

throw table_exception("(? ???)? ? ???");

This topic is closed to new replies.
