A MATTER OF VISUAL PRECISION
When Capcom decided to move its Marvel series' sprites from the CP-S II games to the soon-to-be-standard NAOMI platform five years ago with its popular Marvel vs. Capcom 2, some people realized that something just wasn't right with the game's graphics. Beyond matters of design (which were also an issue), there was a problem in the execution. The game was presented at a high resolution (640 x 480 pixels), but the sprites weren't displayed accordingly: they had originally been drawn for a much lower resolution display, that of the old CP-S II hardware, and they had to be upscaled in order to preserve their size and aspect ratio. Even worse, every other graphic element in the game, including the backgrounds, was actually designed at hi-res, emphasizing the sprites' denatured presentation; the sprites had also been anti-aliased in order to mask the inherent pixelation.
Sadly, this way of conceiving 2D graphics became common practice, not only in arcade games that made use of pre-drawn sprites, but also in arcade-to-home ports for that new wave of 128-bit consoles, and even in original games that weren't "ports" at all. Today it is the odd case when an originally low-res game is displayed at its design resolution, but there are actually more people concerned about the subject now, and a new term has appeared to name the phenomenon: fake low-res. So let's try to explain here what it is all about, so that more people can see the differences for themselves.
The truth is that Capcom didn't invent the thing with its MVC2. Some years before, an arcade fighting game running on Sega's ST-V system, Astra Super Stars, had already displayed low-res-designed characters in a hi-res on-screen presentation, together with hi-res-designed backgrounds. Before that, some 2D games for the N64 (Tactics Ogre 64, Yuke! Yuke! Trouble Makers...) had already used a hi-res display for low-res-designed graphics. But even earlier, fake low-res was a common way of presenting graphics in computer games, such as the arcade ports for Sharp's X 68000 computer. The X 68000 could output true low resolutions at 15 kHz, but, for some reason, some developers chose the 31 kHz display for their low-res games.
First technicality here - the monitor's operating speed. Display resolutions depend on it. Basically, a CRT monitor draws a maximum number of horizontal lines per second (its horizontal scan rate), and according to that figure the CRT monitor is classified as either a low-res monitor or a hi-res one (extended-resolution and medium-resolution monitors also existed in their day, but they were abandoned pretty soon). Given this way of drawing the image, that is, by horizontal lines, any picture displayed by a CRT monitor is visually divided into exactly that - horizontal lines, which implies there is a blank space between every pair of drawn lines. That space, which takes the form of a thin black line, is informally called a scanline, and, obviously, the lower the resolution (and the bigger the monitor), the more noticeable the scanlines are. It's important to understand that the old (and not-so-old) sprite-based video games were conceived with that in mind; that is, the graphic artists drew the games' graphics knowing that the picture would have black lines interleaved between the rows of pixels, so the graphics were made for this type of presentation.
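To picture it, here is a toy sketch, in plain Python, of how a low-res image ends up interleaved with dark gaps on a 15 kHz screen (purely illustrative, nothing hardware-accurate):

    # A toy simulation of scanlines: every drawn line of the picture is
    # followed by an unlit gap of the same width.
    picture = [
        "####  ##  ####",
        "#  #  ##  #   ",
        "####  ##  ####",
    ]

    def with_scanlines(rows, gap_char=" "):
        out = []
        for row in rows:
            out.append(row)                  # the drawn line
            out.append(gap_char * len(row))  # the dark gap beneath it
        return out

    for line in with_scanlines(picture):
        print(line)

The output is simply the original rows spaced apart by blank ones, which is roughly what the artists of the era were compensating for when they drew their sprites.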
Getting back to the numbers: a low-res monitor handles 262 horizontal lines per frame (that's the nominal vertical resolution), which are refreshed (updated) 60 times per second. 262 lines, 60 times per second, means that a low-res monitor draws about 15,720 lines per second in total. That is a frequency of roughly 15 kHz, which is why, whenever someone talks about 15 kHz displays, they are referring to low resolutions. The truth, though, is that a low-res monitor never actually displays all 262 lines. A certain number of them simply accounts for the time the monitor needs before it can start drawing lines again from the top of the screen once it has reached the bottom. In practice, a low-res monitor will display a maximum of 240 lines on screen. That's why the vertical resolution of so many games is 240 pixels, though 224 and even 192 pixels were also quite standard, depending on the year and the CRT technology of the moment.
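As a quick sanity check, the arithmetic above can be written out in a few lines of Python (the constants are the nominal figures used in the text, not exact NTSC timings):

    TOTAL_LINES_PER_FIELD = 262   # visible lines plus vertical blanking
    REFRESH_RATE_HZ = 60          # how many times per second they are drawn

    horizontal_scan_rate = TOTAL_LINES_PER_FIELD * REFRESH_RATE_HZ
    print(f"{horizontal_scan_rate} lines per second "
          f"({horizontal_scan_rate / 1000:.1f} kHz)")
    # -> 15720 lines per second (15.7 kHz)
    # Of those 262 lines, only up to 240 are actually drawn on screen; the
    # rest cover the beam's return trip to the top of the picture.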
Conventionally, monitors capable of drawing 480 horizontal lines or more (that is, able to work at 31 kHz or above) are high resolution monitors. Your standard CRT TV works at 15 kHz but not at 31 kHz, while your standard CRT VGA monitor for your PC is just the opposite. Because of how a CRT works, as opposed to the fixed-grid displays of a modern set (TFT monitors, plasma TVs...), there is a wide range of horizontal resolutions and pixel proportions (yes, pixels don't have to be square) that can be displayed full screen on a CRT monitor, hence the different resolutions that both arcade games and home consoles have been using since their beginnings. If you're really interested in the subject, check out this site, which explains it in detail.
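To make the non-square-pixel idea concrete, here is a rough sketch of the relationship, assuming the picture simply fills a 4:3 screen (a simplification of real CRT timings, meant only as illustration):

    def pixel_aspect_ratio(width, height, display_aspect=4 / 3):
        """Pixel width divided by pixel height for a full-screen picture."""
        return display_aspect / (width / height)

    for w, h in [(320, 240), (640, 480), (384, 224)]:
        print(f"{w} x {h}: PAR = {pixel_aspect_ratio(w, h):.3f}")
    # 320 x 240: PAR = 1.000  (square pixels)
    # 640 x 480: PAR = 1.000  (square pixels)
    # 384 x 224: PAR = 0.778  (pixels clearly taller than they are wide)

A PAR of 1.0 means square pixels; anything else means the CRT is stretching each pixel to fill the 4:3 frame.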
Indeed, CRTs are so versatile that a 15 kHz monitor/TV is able to display a hi-res picture. The process, which involves converting a natively 31 kHz picture into a 15 kHz one, is called interlacing, and it does exactly what the name suggests: it fills the blank space (the scanlines) between the lines with a second scan. Since the two scans are done consecutively, the process generates some visual inconsistencies, like flickering, and will never match the image quality you would get from a true 31 kHz (VGA) display. Even so, some home systems (like the Nintendo 64) rely on interlaced modes for their hi-res output; that is, they render the picture at hi-res, but it can only be displayed at 15 kHz. It was the only way to get hi-res games working on the standard TVs of the time. Though one wonders whether hi-res was really necessary for that result, given the memory limitations, but hey. So, and this is the important thing here, 15 kHz does not necessarily imply low-res; it can also mean interlaced hi-res.
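A minimal sketch of the idea, with a hi-res frame represented as a plain Python list of rows (illustrative only, not any console's actual video pipeline):

    # Split a natively hi-res (480-line) frame into two 240-line fields.
    frame = [f"row {i}" for i in range(480)]

    even_field = frame[0::2]   # lines 0, 2, 4, ... -> 240 lines
    odd_field  = frame[1::2]   # lines 1, 3, 5, ... -> 240 lines

    # A 15 kHz display shows the two fields on alternate refreshes, each one
    # landing in the other's scanline gaps. Any given line is therefore only
    # updated 30 times per second instead of 60, hence the flicker.
    assert len(even_field) == len(odd_field) == 240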
But what about the opposite? What if you want to adapt a low-res (15 kHz) picture to a 31 kHz display? Well, that involves upscaling. And since it has to be done by the rendering hardware, it will be digital upscaling: the on-screen presentation will be pixel-doubled and, technically, the result is a true hi-res picture. To illustrate:
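Here is a minimal sketch of that pixel doubling, in Python and on a toy "image" (none of this is any system's actual scaler, just the nearest-neighbour idea):

    lowres = [
        "ABCD",
        "EFGH",
    ]

    def pixel_double(rows):
        """Duplicate every pixel horizontally and every line vertically."""
        doubled = []
        for row in rows:
            wide = "".join(p * 2 for p in row)   # ABCD -> AABBCCDD
            doubled.append(wide)
            doubled.append(wide)                 # repeat the line: 2x height
        return doubled

    for line in pixel_double(lowres):
        print(line)
    # AABBCCDD
    # AABBCCDD
    # EEFFGGHH
    # EEFFGGHH

Note that every source line becomes two identical, fully lit lines, so the 31 kHz output has no dark gaps at all: the scanlines the artwork was drawn around are simply gone.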
The point is that some popular hi-res systems (DC, PS2, NAOMI and Atomiswave, etcetera), like the X 68000 in its day, are hybrid systems: they are able to output true low-res modes. And that's not all: the X 68000 was sold with a multi-sync monitor, but some systems, like the PS2, are intended to be played through a 15 kHz TV, so any hi-res-rendered game (which includes the fake low-res ones) needs to be interlaced for a 15 kHz display.
Why do this, then? As I mentioned, different systems often use different resolutions. If you're porting, for instance, a CP-S game like Street Fighter II to the Dreamcast, you'll find out that the latter's 15 kHz resolution modes have little to do with the CP-S resolution. Indeed, the CP-S (and its successors) made use of a rather odd resolution (384 x 224). Its aspect ratio was still 4:3, though, since the pixels were markedly non-square. The Dreamcast's low-res mode is 320 x 240, which is not very suitable for reproducing the original graphics at full screen without either altering the aspect ratio or pushing part of the actual play area off the screen. While this is probably the most extreme example one could think of, it serves well to explain the main reason behind the phenomenon, and it also applies to games that simply reuse old sprites on new hardware (like The King of Fighters Neo Wave, to name one) or to emulated releases with (almost) no actual resolution discrepancies between the target console and the original system. Basically, game developers don't want to waste their efforts on redrawing graphics, re-adapting the game area and adjusting aspect ratios to get a full-screen display when they can just leave it in the hardware's hands. By using the hardware's scaling features at a display resolution of 640 x 480 pixels, they get the picture (whatever resolution and pixel shape it originally had) to fill the whole screen, thus preserving the aspect ratio and the visible area, but also denaturing the game's look, adding refresh rate issues and introducing a glaring problem of visual (im)precision, usually made worse by hardware filters.
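A rough sketch of the numbers involved in that last case, scaling the CP-S frame to a 640 x 480 display (just the arithmetic, not any SDK call):

    SRC_W, SRC_H = 384, 224    # CP-S frame
    DST_W, DST_H = 640, 480    # typical "fake low-res" display mode

    x_scale = DST_W / SRC_W    # ~1.667: each source column covers 1.67 columns
    y_scale = DST_H / SRC_H    # ~2.143: each source line covers 2.14 lines

    print(f"x scale: {x_scale:.3f}, y scale: {y_scale:.3f}")

    # Neither factor is an integer, so source pixels end up with uneven widths
    # and heights on screen -- the visual (im)precision described above. The
    # true 320 x 240 low-res mode, meanwhile, simply cannot hold 384 columns:
    print(SRC_W - 320, "columns would have to be cropped or squeezed")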
Blatantly enough, most of these fake low-res ports (since they are usually PS2 games) are not even 31 kHz-compatible, despite being rendered internally at a 31 kHz resolution. That is, they take a low-res game, convert it into a hi-res one, and finally interlace the picture for a 15 kHz display. Hilarious at best, if you ask me.
I should add that all this dissertation is only of relevance to those using RGB monitors. It's totally pointless to care about fake low-res and, at the same time, use diluted outputs such as composite or S-Video. Scanlines are a must if we're speaking about visual precision and true low resolutions, and those are only effective with an RGB signal. Obviously, it is also useless for those who use non-CRT monitors, since any low-res picture will automatically be upscaled by the set itself, no matter how the gaming system outputs it. So now you know: this is another effect of our digital era, full of standardization discrepancies and short on people who actually care a bit about visual art, about visual honesty.