1

(9 replies, posted in English talk)

the width of the scan lines gets reduced (a single line at 15.625 kHz is 64 µs long, the same line at 16.200 kHz becomes 62 µs)

It's not a question of width (space) but of time. The old name for the chassis is "time base", because all you do is drive the electron gun in time. There is no spatial data, unlike fixed-res flat screens. That's one of the major differences between CRTs and every other display technology. That's why we talk about frequencies for CRTs: "resolution" isn't enough to clearly describe the capabilities of a tube.

So, as the scan frequency increases, the line duration decreases. You also have a slight difference between PAL (64 µs) and NTSC (63.55 µs). Every dynamic geometric distortion is a matter of delay added to the signal.
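To make it concrete, here's a quick back-of-the-envelope sketch (Python; the NTSC rate used is the standard 15.734 kHz):

    # Line duration is simply the inverse of the horizontal scan frequency.
    for h_freq in (15_625, 15_734, 16_200):          # Hz: PAL, NTSC, the 16.2 kHz example
        line_us = 1_000_000 / h_freq                 # duration of one scan line in microseconds
        print(f"{h_freq/1000:.3f} kHz -> {line_us:.2f} us per line")
    # 15.625 kHz -> 64.00 us, 15.734 kHz -> 63.56 us, 16.200 kHz -> 61.73 us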

so the video gets more concentrated in the middle of the screen, burning the colors. Even if the contrast had been adjusted for video modes around 15 kHz, you could see how the colors faded to the right with a blue shadow as you changed to video modes with a higher horizontal frequency.

When I first tried to create my own modelines (especially to display very low resolutions at the minimal pixel clock allowed by my card, for example 256x224 at 6.5 MHz), I ran a lot of tests. I noticed fading with very non-standard resolutions, but from top to bottom, not from left to right, and with no color tint. Some resolutions were stable and readable, but only the first few lines were bright; all the others got darker and darker.
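For reference, here's roughly how I derive such a modeline (a little Python sketch; the 256x224 @ 6.5 MHz numbers are the example above, and the porch split is just my own rough centering, not a magic recipe):

    # Derive a 15.625 kHz modeline for 256x224 at a 6.5 MHz pixel clock.
    pclk = 6_500_000                       # pixel clock in Hz
    h_active, v_active = 256, 224          # visible resolution
    h_freq = 15_625                        # target horizontal frequency (Hz)

    h_total = round(pclk / h_freq)         # 416 pixels -> one full 64 us line
    v_total = 260                          # 260 lines -> 15625 / 260 ~ 60.1 Hz refresh

    # Split the horizontal blanking (h_total - h_active = 160 px) into
    # front porch / sync / back porch; sync width ~4.9 us (32 px at 6.5 MHz).
    h_sync_start, h_sync_end = h_active + 52, h_active + 52 + 32
    v_sync_start, v_sync_end = v_active + 12, v_active + 12 + 3

    print(f'Modeline "256x224" {pclk/1e6:.2f} '
          f'{h_active} {h_sync_start} {h_sync_end} {h_total} '
          f'{v_active} {v_sync_start} {v_sync_end} {v_total} -hsync -vsync')
    print(f"h = {pclk/h_total:.1f} Hz, v = {pclk/h_total/v_total:.2f} Hz")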


It would be good to check if voltage differences have to do with horizontal frequency variations or any other feature of the modeline

From what I saw, you get different results with the same frequency (same resolution) depending on the driver (or lack of it). But yeah, maybe different drivers use different pixel clocks and porches to display the same thing, so at the end of the digital-to-analog conversion on the graphics card, you can get various voltages. Remember that this conversion is always a weak point in a lot of electronics (except high-end products). For now I don't really know much about the subject.


To Recap: yes, the screen is already good, and so is the plastic case! ^^

2

(9 replies, posted in English talk)

Calamity, it's good that you are here, I'd just like to ask you a few questions about drivers.

I am currently working on software to get the best possible results from ATI video cards on CRTs by extending the regular driver capabilities

So, you made a hack to allow more different resolutions at a time. But do you know why different resolutions can produce different voltage levels at the VGA output? It's not a hardware question (a few resistors or amps), because the values change between drivers and safe mode... For now I'm quite clueless, because I only noticed it recently; in the past few years I would never have had the idea of measuring the voltage out of the VGA port.
Because you know, a wide range of resolutions is good, but signal accuracy is really crucial to get the best possible result on a CRT. You will say that most video games have a very limited on-screen color palette (and that's true), but to properly calibrate a CRT, you need to have a signal between 0.3 and 1.0 V.
When you display a 15 kHz picture at 321x240 for example, you can't change the gamma/brightness/contrast values for each channel (red, green and blue). You can do it at 480i, but anyway, you can get slight variations between 480i and 240p...

The ultimate driver would be one that allows many different resolutions and a constant voltage signal (video part always between 0.3 and 1.0 V), or at least a way to quickly adjust parameters in any resolution (like the GUI of ZSNES, which suits every resolution, unlike many other emulators designed for 640x480).

Since PC users have never ever complained about voltage levels (who the fuck cares? Even those who wanted accurate colors just tweaked the driver options, without measuring the voltages), I think this issue has been there almost from the beginning. I need to check other graphics cards, and different motherboards, to confirm whether some setups deliver more variation or more consistent levels across resolutions. (If you guys could do it with your graphics card, even with a basic 10€ multimeter, that would help! ^^' )




I still have to find a proper explanation for the behavior observed by Daicon-X on his TV, a Bluesky 21". To save you the translation of the whole thread, I would say that this chassis has a "non-linear" behavior, with seemingly random results from subtle changes in the input modeline, especially around 57 Hz. The most obvious problem appears as a block of lines displaced to the left at the bottom of the picture.

Bluesky is a brand of cheap TVs. While they have some good Philips/Thomson tubes in them, they use some crappy Vestel chassis ("11AKxxx") that break easily and are not well designed. And of course they come with a poor setup.

Generally speaking, digital electronics tend to be much simpler to design, and they have a more regular behavior than analog ones. So, on older analog chassis, you often find very wide tolerances on timings, porches and frequencies. On digital ones, especially because they are designed around a very small set of broadcast signals (and not video games :P ), you tend to have very little room for non-standard signals.

If you have trouble, it's better to display a non-standard picture in a standard frame (for example, a 304x224 picture in a 320x240 frame), and to choose refresh rates close to 50 Hz and 60 Hz (meaning that you won't be able to have perfect emulation for 53 Hz and 57 Hz games, because they are too far out of range; the tolerance is generally only about 2 Hz).

Either way, you need to try several pixel clocks and porches to avoid picture deformation. Some chassis won't show you any picture at all if the signal is too far off (while the same signal can be sent to a much "friendlier" analog chassis...).

So, as usual, older means better: you will enjoy the wide variety of 15 kHz pictures on spherical analog TV sets. If you choose a TV from 1994 to 1996, you can still find "hybrid" chassis that have both digital (geometry correction) and analog parts (especially for the video amplifier on the neck board).

Recap's Trinitron may be the ultimate TV, because it's a cylindrical tube with an early digital chassis that lets you easily deinterlace 480i, it still has lots of analog components, and it takes a lot of resolutions around 15 kHz. OK, it's a pain in the ass to properly converge it ^^, but I gained a lot of skills by working on it (in fact, this TV set has been my "conejillo de india" :P for testing everything about CRT convergence and geometry).

3

(9 replies, posted in English talk)

This is the last reply from Pegote, but when I moved the messages here, I forgot that he doesn't have access to this place. If he shows his email address in his profile (after reading my last reply on the other topic), I will send him the explanation:


Pegote wrote:
Eboshidori wrote:

on flat CRTs, it's a pain in the ass to get good linearity

Maybe I'm only taking this particular line completely out of context, but: do you mean, for example, that vertical lines on a flat (flattened?) CRT can never be parallel? Like, is this a known issue that is unavoidable with this type of screen?

No, on direct-view tubes, you always have parallel lines. The only tubes that can display a trapezoid image (with one side bigger than the other) are those in three-tube projector / rear-projection sets (because the red and blue pictures have to be converged onto the green one; but after a good setup, from your point of view, you should see horizontal, parallel lines in the final picture).

The geometric corrections on a direct-view tube allow you to change the shape of the picture, but you always have parallel lines (you can get a trapezoid picture, with the top lines a different size from the bottom ones, but still always parallel).

The linearity (in this case, the horizontal one) concerns the variation of pixel size from left to center, and from center to right. To display a regular picture on any tube, you need to add some variations to the signal (waveforms), because the course of the beam, which would naturally describe a perfect circle, never matches the shape of the tube. When you have a spherical (or cylindrical) tube, it's a different radius, but it's relatively easy to adapt the radius of the beam closely to the radius of the tube. When you have a flat tube, you need to turn a radius into a straight line. And it's never perfect. It involves not only waveforms but also really precise driving of the high voltages in the flyback (THT), something you can't find in large TV sets for the home market (because accurate electronics for such high voltages cost a lot of money, and home TVs are not supposed to display graphics with straight lines; it's movies, broadcast shows etc.).
The linearity issue is really annoying for 2D stuff, especially when you are a shmup addict like me, because the slow, regular, continuous scrolling always shows the defect. That's why I prefer spherical tubes over flat ones.
On spherical tubes it's not perfect either, but it's much less visible, and easy to correct to a decent level.



I have to ask, because a while ago I spent some time with my 29" flat CRT TV trying to calibrate it to have a "straight" image and the best I could get (after a lot of swearing) was an image that kinda looked like this: ))|((

(Mind you, my TV is some locally assembled, chinese manufactured mammoth. As much as I'm pretty ignorant about all of this, maybe it's just the TV that's shit.)

A picture tube displays whatever you send to it. But if you want a good picture, with good geometry, you need electronic circuits that allow you to finely tune as many parameters as possible; you need something well designed. And on cheap Chinese TV sets, you can barely find anything good enough to correct your signal. But in those TVs, you can find good picture tubes, because only big, long-established manufacturers make tubes (RCA, Thomson, Philips, Sony, Panasonic, Samsung, NEC... in fact, there are very few of them). Most of the time, a small company buys tubes from the big manufacturers, then designs its own electronics to drive the tube, or simply buys cheap chassis made by other brands that don't make their own CRTs either; they put it all together in a plastic case, stick a logo on the front, and there you go!

The quality of a monitor relies heavily on the chassis (the electronics) that drives the tube, and on the proper setup that needs to be done by trained technicians (or crazy old-school video game addicts :P ), because it's very difficult and involves lots of constraints and compromises. And above all, it takes time. Time is money, so if you want to sell cheap monitors/TVs, you choose an easy tube to drive (a spherical one), you use a cheap chassis, and you don't spend any time properly tuning the picture at the end of the assembly line.


As for your picture problem (the ))|(( shape), it's a typical pincushion issue (it's different from linearity). Normally, even on a flat CRT, you should be able to correct it easily via the "service menu", which you can enter by typing a special code on your remote (you need to know the reference of the chassis and find it on the web, on http://www.eserviceinfo.com/ for example).

4

(9 replies, posted in English talk)

So, back to the PC at low resolution, with an issue I'm sure absolutely nobody talks about:

Voltage variations of the video signal at the VGA output.

The norm for the VGA signal is fortunately the same as for home devices: 1 volt.
The video signal lies between 0.3 V (black level, the screen is supposed to emit no light) and 1 V (peak white). So, you have a 0.7 V amplitude, from which you get 256 values per channel to build a 16.8 million color picture.

0.7/256 = 0.002734375. Basically, you have about 0.003 V (or 3 mV) for each of the 256 values of each RGB channel. Any variation of this range and you lose color accuracy. I won't bother you with a shitload of measurement tables, but I got lots of different readings: not only for different resolutions between 640x480 and 1600x1200, but also for anything under 640x480, with different levels between the R, G and B signals (though when red is higher than the others, it is higher in every resolution, for example), and different readings depending on the original drivers, the newest ones, and Windows "safe mode" (absolutely no driver and no service or software applying any correction to the display).

On my ATI Radeon 1250, with a pure white full screen and no software calibration (raw output), it goes from 0.930 V (red signal, 1280x1024, safe mode) to 1.114 V (green signal, 640x480, safe mode). I never got the same results between resolutions, using the same mode (either safe mode or ATI drivers). I tried another card (a Radeon HD 2400, PCI Express, which by the way can't go under a 7.40 MHz pixel clock, meaning I can't display anything under 352x256 :[ ), and I got different results, with almost the same (wide) range around 1 V. Very few combinations of resolution and driver gave me something close to 1 V (which is what you would normally expect...).
You will tell me that you can modify the graphics card settings via software (in order to calibrate your screen), but it's very limited (it can be OK for a slight correction, say a 0.996 V signal, but when you deal with 1.114 V, it's way too far off). Now I understand better why it has always been so difficult to maintain a good greyscale and to display proper tones under 5% and above 95% intensity...
Damn, graphics cards should send a signal with a 0.7 V video range, never going above 1 V, at every resolution! And above all, the same level on every channel (R, G and B). The only time you should have to tune the levels is when you can't tweak your monitor setup any further (because of aging cathodes, each at a different level).
But when you think about the fact that almost nobody takes the time to properly calibrate their PC screen (even most graphics professionals), nor tweaks the levels via software, plus the fact that back then almost everybody used the "brightness" setting the wrong way (to get greyish blacks, not an actually brighter picture...), plus the way-too-high color temperatures... it's hopeless. I never read any advertising or review of a graphics card saying: "it's magic! Perfect video signal amplitude at any resolution!"
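To put my measurements in perspective, here's a quick sketch (the two readings are the ones quoted above; the rest is just the standard 0.7 V / 256 levels arithmetic):

    # How far off are the measured peak-white levels, expressed in 8-bit DAC steps?
    BLACK, PEAK = 0.300, 1.000                 # nominal levels in volts
    step = (PEAK - BLACK) / 256                # ~2.73 mV per 8-bit level

    for label, measured in (("red, 1280x1024, safe mode", 0.930),
                            ("green, 640x480, safe mode", 1.114)):
        error_v = measured - PEAK
        print(f"{label}: {error_v*1000:+.0f} mV peak-white error "
              f"~ {error_v/step:+.0f} levels out of 256")
    # red:   -70 mV  ~ -26 levels
    # green: +114 mV ~ +42 levels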

I wanted to use the PC to display a precise calibration pattern (thanks to Soft15kHz) to finely tune my TV screen (because the relationship between G2, cut-off and the gains is crucial to get the best scanlines, the sharpest spot), but as always, it's not "plug and play", you have to burden yourself with problems to solve. Damn, whoever created the PC is a devil! ^^'

5

(9 replies, posted in English talk)

Recap wrote:

What do you use for 31-kHz gaming these days? A tiny 20'' VGA PC monitor? I hope you don't !

I use a Diamondtron (Trinitron licensed to Mitsubishi) 22" PC monitor, with 20.5" of visible picture (52 cm). UNMATCHED quality: no other screen can display this resolution with such a precise and pleasant-to-the-eye pixel, no other screen can produce this "scanline feeling", so "video game"-like.
There's no way I would play 480p on a fixed-resolution flat screen (whether LCD, plasma, DLP...). Even if I found one of those 4:3 40" 640x480 plasmas (to avoid any loss of picture quality and lag due to scaling), I would not trade my little PC monitor. Even considering only CRTs (the best technology ever for gaming :P ), I wouldn't buy a large 29" tri-sync monitor (of any type), because I know I wouldn't get such an amazing picture (because of the large pitch and the rather imprecise cathodes, focus system and yoke winding). Sure, I would still get a very nice picture (much better than any crap flat screen), but a "plain" one (= no "scanlines", no little black space between lines) and rather soft, with some moiré and convergence issues (which can be slightly reduced on the few tubes with a slightly smaller pitch, and with a precise yoke that you will need to adjust anyway for convergence issues, because they become very annoying when you push the tube to its limits).

So, for me, the ultimate gaming screen (a single one, for every use, meaning every resolution) doesn't exist.
You need CRT technology of course :D , but you definitely need at least two kinds of CRT:

- one for 15 kHz (and the few 24 kHz things).
Easy, it has existed forever: it's the "standard resolution" tube, which can display the best 240p output in a large format, direct view (29" to 40"). No other tube can display such a large beam, with all the qualities we like from the behavior of the beam (visible variations of scanline size).
The curved Trinitron is your best friend, especially in tate, because it gives you not only the sharpest beam and precise scanlines (with nice variations of size according to colors, while staying precise down to the smallest spot size), but the cylindrical shape also gives an "Axelay effect" to all your vertical shooters, much more than a spherical tube (and of course more than a flat CRT ^^').
Playing a masterpiece like Layer Section (with lots of "3D" effects thanks to parallax and multiple sprites with their own speed for the big enemies) on a finely tweaked curved Trinitron is really amazing, a quality you would never find in any arcade cab'. It's like rediscovering the game! ^^

The old analog chassis (before 1994) are the best at giving you accurate and well-balanced colors, with the fewest modifications of the video signal, and easy geometric corrections. Digital chassis tend to have too many "enhancement" features (denaturing the signal), and on flat CRTs, it's a pain in the ass to get good linearity (when you want perfect scrolling, something you especially want in shoot'em ups), and it's even difficult to adjust colors (because of design choices: every manufacturer wanted to show a model with blues bluer than blue and reds redder than red, because the digital chips allow it easily).
But you can only find "raw deinterlacing" (as a feature of the TDA8366, for example) in digital chassis, and until I manage to build a separate circuit to do it properly (or someone else does it ^^), it's the only way to do it with genuine hardware and games.

- then, if you want the best 31 kHz picture, you need CRTs that can go higher than 31 kHz. But because they can go higher, it means they can't go lower than 31 kHz (when they accept a 15 kHz input, they upscale it to 31 kHz).

The easiest and cheapest way is to buy a high-end PC monitor; you can get some for peanuts (<- we use this expression a lot in French for "ridiculously inexpensive" ^^' ).
I have 5 big 21-22" monitors, from 1995 (curved Trinitron) to 2002 (flat Diamondtron). They all produce nice pictures, but the last one delivers an amazing display. Each line has the same characteristics as the ones you see on a regular 29" Trinitron at 240p. It's your 31 kHz stuff with a 240p look, in Trinitron quality. Something that I personally highly appreciate. :P But OK, it's small...

To find the same quality on a bigger display, don't look at regular tri-sync monitors, which are basically "simple TV tubes" (large pitch) with electronics that allow 31 kHz (and keep in mind that many large "XGA" CRTs, the ones supposed to handle 1024x768, have a large pitch too, meaning you can't reach the same quality as on a smaller fine-pitch PC monitor). No, you need to find one of those early HD TVs from the USA (around 1999, fine pitch and 4:3 format), or expensive (even on the second-hand market) broadcast/graphics monitors. Almost none of them support 15 kHz, because they are basically PC monitors of a bigger size (and they were damn expensive when they came out, and not available to the home market).

Another way is to use a three-tube projector. You need a dedicated room to operate it if you want a decent picture, and of course you need to know how to set up the projector. You can also build a rear-projection system with a three-tube projector (the basic 7" ones can easily produce a nice 800x600 and are cheap nowadays), so you can use it in a living room, but you get the typical issues of rear projection: narrow viewing angles, low contrast. Never as good as a direct-view tube.


So, for me, in tri-sync monitors, only the chassis are interesting. Just buy the electronics and adapt them to a good TV tube (you can find one for free on the streets, or buy one in a repair shop for 10-15 €). But in arcades there aren't any Trinitron tubes, so all the chassis are designed to run on classic shadow-mask tubes, either spherical or flat. And the difference in quality between a 240p picture on a Trinitron and on a shadow-mask tube is even stronger for 480p stuff, because a Trinitron is better at displaying a thin spot (though you still have the large-pitch issue: a 29" large-pitch Trinitron can't produce the same precision as a 22" fine-pitch one). Hey, it's a decent quality anyway, but from everything I've seen so far (in arcades and in friends' home cabs), the 480p result is far from my little 22" monitor. Even worse, an unfiltered 480i picture on a classic Trinitron is more precise (because the shadow mask doesn't leave enough room for the beam to strike the phosphors in every situation). That's why I don't want to spend several hundred € (monitor and shipping) on something that my little 30€ PC monitor will put to shame (and even the old 1995 one, found for free in the street, will produce a better 480p picture :P).


Yeah, it's not easy; you need to know exactly what you want, what is available, and how demanding you are.

6

(9 replies, posted in English talk)

[ split from the thread " El PC y la baja resolución." [>] ]


Pegote wrote:

Hey Eboshidori. I'm afraid I didn't make that post :vP and I don't have any of those beautiful monitors. I only linked to that post earlier (actually, to one of its photos) for reference. I even got in contact with that dude to ask him some questions about the monitors. (He bought a few for a video exhibition. 26 of them!) Me, I'm only gathering some info before I buy a monitor similar to the one mentioned before (a couple of composite inputs, and one 9-pin RGB input - no SCART.)

Still, your post is full of stuff I should look into, so thanks for commenting!

OK, so, the thing you must remember when considering buying a tri-sync monitor: a broadcast/video-wall/graphics monitor doesn't need video amplification to connect any home device that runs at 15 kHz (because the signal is 1 volt for both the consoles and the monitor input).
If you plan to connect a few arcade PCBs, it's very easy to add potentiometers (~220 ohms) to reduce the voltage amplitude.
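A rough idea of why ~220 ohms is the right ballpark (just a sketch, assuming a standard 75 ohm terminated RGB input; real PCB outputs vary, so you tune the pot by eye anyway):

    # Series potentiometer feeding a 75-ohm terminated RGB input:
    # the video amplitude is divided by R_load / (R_series + R_load).
    R_LOAD = 75.0                                   # assumed input termination (ohms)

    def attenuated(v_in, r_series):
        return v_in * R_LOAD / (r_series + R_LOAD)

    for r in (0, 50, 110, 220):                     # pot positions in ohms
        print(f"{r:3d} ohm in series: 3.0 V peak -> {attenuated(3.0, r):.2f} V")
    # 0 -> 3.00 V, 50 -> 1.80 V, 110 -> 1.22 V, 220 -> 0.76 V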

If you choose to buy an arcade monitor, you have almost the same sort of picture tube, but you will need to buy a video amplifier (around 100 € for RGB stuff), or you will need to modify an MGCD (a Chinese PS2-to-JAMMA adapter with an embedded amplifier, for about 50€).

Otherwise, if you are lucky, you can find one of those rare TVs that have a VGA input (15-31 kHz, no 24 kHz support of course). For now, I've only seen Grundig ones. Other brands (Thomson, Philips...) made some TVs with a VGA input, but they were "100 Hz" (meaning every 15 kHz signal is modified to be "double interlaced", and when you have the option to skip this processing, your 240p source gets transformed into a standard 480i, which is pretty bad).


Personally, I would buy an arcade monitor if I could find a 29" picture tube with a 0.68 mm pitch, instead of the common larger 0.79 mm one, which is really fine for 15/24 kHz but a little too coarse to display precise dots beyond 500 pixels per line. When the number of pixels starts to match the number of phosphor triads, you lose picture quality and begin to see moiré errors. For every CRT, you need to work below the limits of the tube. The pitch is one of those limits.
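To see why the 0.79 mm pitch runs out of steam around 500 pixels per line, a little sketch (the ~550 mm visible width for a 29" 4:3 tube is my own rough figure):

    # Rough count of phosphor triads across the visible width of a big tube.
    VISIBLE_WIDTH_MM = 550          # assumed visible width of a ~29" 4:3 tube

    for pitch_mm in (0.79, 0.68):
        triads = VISIBLE_WIDTH_MM / pitch_mm
        print(f"{pitch_mm} mm pitch: ~{triads:.0f} triads across the screen")
    # 0.79 mm -> ~696 triads, 0.68 mm -> ~809 triads
    # With ~700 triads you want to stay comfortably below that (say ~500 pixels
    # per line) to avoid moire; a finer pitch pushes that limit up.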

You can find smaller pitches on large tubes (like some Mitsubishi "Megaview" monitors, some early HD CRTs in 4:3 format), but in many cases they don't go below 31 kHz, and when they take a 15 kHz signal, it's upscaled to 31 kHz. Because when you want to increase resolution with CRT technology, you need an electron gun with thinner cathodes to display a smaller spot... meaning it's no longer capable of displaying a huge 15 kHz spot.

For now, 0.68 mm is the lowest pitch I've found on big screens that handle 15 kHz properly.

Recap wrote:

As far as I know, those weren't actually different to the ones they made for arcade cabs, though they got a shell and different connectors, depending on the model. For this one, he's not sure if it's tri-sync, hence the inquiry.

The picture tube may be the same (of the same quality, i.e. the same precision for the pitch, around 0.8 mm), but the electronics may differ in more than just the connector. An arcade monitor requires at least 2.5 volts for a video signal at 15 kHz, while a broadcast/wall monitor may only need 1 volt. The wall monitor may not be able to accept composite sync (i.e. it may need an external sync splitter, or a special box as mentioned above).



I asked him about reprogramming his ATI driver for multi-sync users (as they are now, you can only use them for 15 kHz). He was trying to tell us that he wouldn't know how to properly modify the driver unless he gets one of these monitors, given that he doesn't want to release an untested soft.

OK, I understand better. But it's the same thing: even if you can set lots of resolutions for 24 kHz and 31 kHz with the special driver... where are the games that run at those resolutions? ^^'

24 kHz is 640x350, 496x384 or 480x384... It would be nice to run an Xbox 360 at 640x360 for example ^^, but you would need a 16:9 CRT.


640 x 480 is used by many, many games. And not only PC ones, you know.

There are many games that run at 31 kHz (from PC or other hardware), but it's 640x480, or 640x448 (the PS2 in NTSC, for the very few games designed for it). I don't know of games that run at other resolutions. You have some mid-90s PC games that run under 640x480, but you could display them with black borders inside a standard 640x480 frame.



Beware that the arcade monitor standard is 2.5 to 5 volts for the video signal (15-24 kHz), and 1 volt for 31 kHz.
If you want to plug in any home console, or your PC driven at 15 kHz (1 volt), you need a video amplifier, otherwise you will get a very dark picture.

That too. Some elaboration on this for posterity's sake, please?

A video signal has several levels:

- 0 volts for sync pulses
- 0.3 volts for black level
- 1 volt for peak white (15 kHz from home devices, 31 kHz from PCs or arcade boards)
or 2.5 to 5 volts (peak white) for 15/24 kHz arcade boards, and CGA/EGA devices.

If you feed a 1 volt video signal to a chassis that requires at least 2.5 volts, you will get a picture, but the chassis will consider that everything in this signal belongs to the lower end of the greyscale, meaning a very dark picture on your screen (and you can't do much about it, even by pushing the contrast setting to the max).

So, you need a video amplifier that keeps the black level (0.3 volts) and amplifies the range between 0.3 and 1 volt (the 0.7 volt amplitude) to the required level, so that the peak white of the source matches what the chassis of the monitor expects.
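In numbers, it looks like this (just a sketch of the idea; the 2.5 V target is the low end of the arcade range quoted above):

    # Keep the 0.3 V black level, amplify only the 0.7 V video swing so that
    # peak white lands on the level the chassis expects.
    BLACK, PEAK_IN = 0.3, 1.0

    def amplify(v, peak_out):
        gain = (peak_out - BLACK) / (PEAK_IN - BLACK)   # gain applied above black
        return BLACK + (v - BLACK) * gain

    for name, v in (("black", 0.3), ("50% grey", 0.65), ("peak white", 1.0)):
        print(f"{name}: {v:.2f} V in -> {amplify(v, 2.5):.2f} V out")
    # black stays at 0.30 V, 50% grey -> 1.40 V, peak white -> 2.50 V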





768 x 512 is actually used only by Sharp's X-68000, it seems. Would be nice to get it (or 512 x 512) on your 31-kHz monitor, but better yet, 512 x 512 interlaced on your Trinitron TV in order to de-interlace the games via the TV's feature and get them displayed at their design resolution. It's indeed a tricky subject and I still need to discuss it with Calamity. And deserves a whole article, I guess.

Yeah, because there aren't many games designed for full 512x512 and 768x512, I guess...

Recap wrote:

It's perfectly accurate, though! "Guinea pig" to you and me.

Yeah, "conejillo de Indias" is "guinea pig" in English. But in French, it would have been "cochon d'Inde" (a pig from India). I don't know why they chose "rabbit" instead, and a "rabbit from America", which doesn't refer to the "conejillo de Indias" at all. And I still didn't understand the sentence; is it a special expression in Spanish? Because there's no relation whatsoever with computer stuff, nor monitor things... ^^'


OK, I just checked a topic where Pegote explains his situation:

http://www.avforums.com/forums/crt-tvs/ … input.html

So it's a modified wall monitor. And it accepts analog and digital signals through the same connector! ^^'
(you just have to switch between analog and TTL)

The 9-pin connector is clearly labelled "RGB HV" (HV for separate sync, on 2 pins), meaning that you may need an external sync splitter if you send a 15 kHz signal (composite sync, on one pin). The chassis may be 31 kHz only, but I don't think so.

Polarity of the sync pulses: the 15 kHz video standard has negative sync (meaning below the voltage of the black level, set at 0.3 volts).
There are a few sources that provide negative sync (H) and positive sync (V), but in most cases, sync is negative for both H and V.



(on the AVS forum)

Would I need to ground red, green and blue separately at both ends? The Barco pinout only describes 2 ground pins. As the Hantarex and the Barco seem to have been part of the same videowall at some point, do you think they are likely to have the same pinout on their respective 9-pin inputs?

Generally, you need to keep the signal grounds separate along the cable, but at both ends they can be joined.
The pinout may be the same for the Barco and the Hantarex, but it depends on how each chassis works and what it needs from the signal.


The Barco showed nothing, but the Hantarex at least gave me some sense of a signal (i.e. multicoloured lines which responded to me stopping and starting the DVD).

I don't understand the last sentence...

9

(3,160 replies, posted in Hablemos de juegos)

Here's a bunch of quality screenshots of Gigantic Army. For great justice:

http://i37.tinypic.com/2efoxgi.png
http://i34.tinypic.com/c8ao6.png
http://i38.tinypic.com/2n6i6qa.png
http://i35.tinypic.com/2vsq1aa.png

Yeah, it seems promising, with many good ideas, huge battles, good controls... and bitmap graphics! :D

10

(139 replies, posted in Hardware del vídeo y emulación avanzada)

Hola ! ^^

I will try to reply in this topic, with the approximate help of online translators ('cause I'm French).


Me encantaría ofrecerme como conejillo de Indias pero yo tampoco tengo uno aún, y estaba un poco esperando a preguntarte a ver.
[i.e. "I'd love to offer myself as a guinea pig, but I don't have one yet either, and I was sort of waiting to ask you first."]

A funny translation:

" J'adorerais m'offrir comme petit lapin de l'Amérique mais je n'ai pas non plus l'un toujours(encore), et il(elle) attendait pour te demander un peu à voir. "

I don't know what a "little rabbit of the Americas" has to do with this... ^^'


But OK, just a few words about tri-sync monitors :

Everybody still uses the "CGA-EGA-VGA" naming to describe these monitors, which can lead to confusion, because those standards are digital, and there are some digital-only monitors, and some that accept both digital and analog signals (on separate connectors).

When you have a tri-sync monitor, you should read "15-24-31 kHz", not "CGA-EGA-VGA", because even if it concerns the same frequencies, the latter are digital norms.

I saw a link to Starcab.net, even if it's not about a Hantarex monitor, so I guess your monitor is really an arcade one, and not a stand-alone broadcast monitor or something else. And man, in arcades, almost everything has been analog video for decades. Your tri-sync monitor, like many arcade monitors, is designed for analog video at several frequencies, and it has only one connector, so every signal goes in there (and you can be sure it's analog only, because when you have digital+analog monitors, you always have separate connectors).

You need to know the pinout of the connector (pins for composite sync at 15 and 24/25 kHz, and separate sync at 31 kHz), and you need to know whether the frequency selection is automatic or whether you need to move a jumper on the chassis (on the big PCB under the picture tube).

Beware that the arcade monitor standard is 2.5 to 5 volts for the video signal (15-24 kHz), and 1 volt for 31 kHz.
If you want to plug in any home console, or your PC driven at 15 kHz (1 volt), you need a video amplifier, otherwise you will get a very dark picture.


Another thing: the same way you have lots of resolutions around the 15 kHz frequency (from 240x192 progressive to 768x576 interlaced), you also have lots of resolutions around 24-25 kHz and 31 kHz. The only limits are the margins of the electronics. They vary between manufacturers, but in the analog world you have a lot of freedom, more than with the old digital standards (with very limited resolutions and colors available for each frequency).

Some monitors can accept non-standard frequencies (outside the basic 15-16, 24-25 and 31-33 kHz), but most of them are locked around those.
Virtually every modern chassis (less than 15 years old) can accept out-of-range frequencies without damage. They either refuse the signal (telling you "out of range") or simply accept it, but you won't be able to see a usable picture.
Some old TVs (15 kHz only) display a weird picture if you send a 31 kHz signal to them, but they aren't damaged, as long as you don't leave it on too long. So, try many frequencies to see the margins of your monitor, don't display out-of-range signals for too long, and it will be fine.
Anyway, there are very few resolutions used at 24-25 and 31 kHz, because there are very few games that run at those frequencies. When you talk about 24 kHz, most of the time it's Sega hardware (and a few Konami ones), with two or three given resolutions, and 31 kHz is mostly used for 640x480 (I don't know if there are games at 853x480, for example).
Some Japanese computers may use different resolutions at those frequencies (like 768x512), but the most diverse frequencies are around 15 kHz (and most of the best video games ever run at 15 kHz :D ).


So, I hope everything is clearer now... ^^

11

(40 replies, posted in Hablemos de juegos)

Another video from the same user:

http://www.youtube.com/watch?v=i3xGgxWKo80

I wasn't very fond of Progear (horizontal scrolling and one-color bullet patterns) and didn't try Death Smiles, but this time, I may give it a closer look.

12

(29 replies, posted in Hablemos de juegos)

So, I went to gamespot.com to check the screenshots... "View full size": 1200x675. Sounds weird. A close look at simple elements (the zeros in the score) clearly shows it's not native res' (it could have been).

The same day, Gamekult, a modest French site (not as big as the US sites), put the same series of screenshots online:

http://www.gamekult.com/images/J000103987/150022/


Look at the first screenshot:

http://www.gamekult.com/images/ME0001288100/

Resolution: 1280x720. But this time, it's really a genuine screenshot at its exact resolution. Look at the zeros: they are all identical, and show no upscaling artefacts (just a slight JPEG compression artefact).

So, we can guess that Arc System sent the same screenshots to every site, but some sites decided to downscale them, just for the pleasure of adding as many artefacts as possible... (because, remember, we are in the age of "crystal clear" displays...)

^_^ :P :P

13

(86 replies, posted in English talk)

Recap wrote:

It's not different in this regard, I guess, to plugging a standard video card into a 15 kHz TV with S-video and the TV mode.

The "TV-out" of graphics cards is heavily filtered!!! And it's not possible to remove the anti-flicker option. Soft15kHz is a much better tool for getting a usable picture.



Good luck and keep us informed. Sounds like too much work.

A guy who created a ZX Spectrum clone ("Harlequin") from scratch made a website with lots of information on sync generation. I'll ask him for some tips regarding the raw deinterlacing circuit (how to properly regenerate the vertical pulses from the horizontal sync once they have been split with the help of an LM1881).
Maybe I'll need to use an FPGA, considering the various syncs (and porches) of the different gaming consoles.



Speaking of dying things, do you know somewhere to get a 29'', multi-sync CRT (15-24-31) currently, brand-new, if possible? Hantarex has stopped selling them here in Spain. A shame, 'cause they even had them SCART-ready and with a shell perfectly suitable for placing them vertically too...

I don't know. Even Pentranic (in the UK) closed their page about CRT monitors.

But you know, an arcade CRT isn't different from a regular TV tube. It has the same large pitch (0.7 mm for a 27-29", and even larger for Wells-Gardner tubes!), the guns are the same, same shadow mask.

Instead of buying the whole monitor, you can just buy a tri-sync chassis and plug it into a good tube with a nicely designed yoke (Philips, Thomson, Videocolor, Panasonic...). Wei-Ya chassis are a good deal, even if they aren't the best chassis out there. For less than 150 € (shipping included, if you make a grouped order with other guys, which is easier to do than for a complete monitor), it's really interesting.

A nice thing to do would be to adapt those chassis to a Trinitron tube. I'm working on the subject, since there are important differences between Trinitrons and regular tubes (the gun structure and the video amplifier board, especially the H-stat part, which doesn't exist on standard tubes).

Personally, I would buy a brand new 29" tube only if I could find one with a low pitch (around 0.4 mm).
Most TV tubes age well, especially Trinitrons, which can deliver a sharp spot even past 90,000 hours (check it [>]: 89,387 hours, 9 months ago).

I have a Sony PVM from the early 1990s (27"), and the picture is amazing; you would never guess it's around 20 years old.
TV repairers who do CRT rejuvenation almost never do it for Trinitron tubes (there's no need).


When you manage to perfectly set up the electronics that drive the tube, there's no difference between what you can see in the best arcade cabs and on your regular TV.
Most TVs sold to the public are poorly adjusted and deliver a crappy picture (with poor focus and bad greyscale), but it's only a question of time and knowledge to overcome this.
Arcade monitors have a better setup, but you can encounter convergence problems and even greyscale tracking issues there too.


CRT technology is good technology. For many years, manufacturers have known how to build good picture tubes. The main thing is the electronics driving the tube, and how to set them up properly. That's what I've been working on for many years. I've tested many chassis swaps, yoke swaps and tube swaps, and I came to the conclusion that there aren't "bad tubes"; it's just a question of setup (and, well, a good design for the yoke). So, I'm no longer interested in buying a "real arcade tube"; only the electronics interest me.

14

(86 replies, posted in English talk)

Sure, but for games originally of 224 lines on NTSC machines (the ones that matter), it could serve. Anything's better than plain scaling.

Yes, in that case, it works fine. But "plain scaling" should only be done if the resolutions are exact multiples (224 and 448, 240 and 480).
On a CRT display, especially at low resolution, the most important thing is the precision of the lines. If you want "full screen", you just have to change the geometry (size) of the picture. But that's something a lot of programmers don't know or don't take into consideration, assuming that the typical customer will never be able to change the size of his picture...

When all game systems had lower resolutions and smaller color palettes, nobody could stretch and filter the picture. The best choice was always made (because there was no other choice ^^): crop the picture.
The first console to open the path to shitty filtering and bad scaling was the Dreamcast.


I guess that you'd get the same issues with the soft patch?

I hardly expect that changing just a few hexadecimal lines will magically produce good, regular scaling... The few modified lines (in fact, the same line repeated several times in the ISO file) remove the interlacing (they set the 240p PS1 mode), but you will still get the scrolling artefacts, and the blurry picture of course.

So yeah, I'm just saving money to buy the PCB... :P (and I'll do the same for Ketsui).


Just to clarify, for the games which just keep the desktop rez on full-screen mode, you don't force anything. They work flawlessly if you're at 320 x 240. For the games which get pixel-doubled on full-screen mode, I never knew how to force them for a 240p display, I'm afraid -- they auto-change to 480i.

Ok, I understand better.

If the software is able to display 480p content at 15 kHz 480i, it may be possible to "remove" the sync pulses responsible for the interlacing before the output and display a 640x240 frame.
It's not exactly a genuine 320x240, but it's impossible to tell the difference. ^^

Even if nothing is done that way, as long as you display an unfiltered 480i picture with the same content on each pair of lines, you have the possibility to get a nice 240p picture by removing the interlacing on the TVs that allow it (or on any 15 kHz monitor, if I finish my little electronic circuit ^^).


It's nice how they turned a by-product into a wonderful artform, though

The strongest constraints are the roots of the most valuable achievements.

That's why video games are dying inexorably as time goes by.

15

(40 replies, posted in Hablemos de juegos)

When you read "JAMMA" and "15 kHz screen", sure, you can think of glorious bitmaps and lo-res, but let's wait for the first accurate screenshots. ^^

Which system does the game run on? The same as Dai-Fukkatsu?

16

(86 replies, posted in English talk)

It's the TV I'm using currently. I just need to find "Interlace", right?

Go into the service mode (with the remote), and check it. Sometimes it's called "VCO", or "XTAL".


Some interlaced PS2 games should be originally 224 lines instead of 240, like, say, the ports of Psikyo games.

It's even worse, considering the PAL and NTSC resolutions: in many compilations, you have games with 224 lines that are stretched to 480 lines (the PAL resolution of the PS2, 50 Hz), and you have games with 240 lines stretched to 448 (the NTSC resolution, 60 Hz).
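The scale factors say it all (a quick calculation; a non-integer factor means lines get duplicated irregularly):

    # PS2 compilation scaling: original height vs. output height.
    cases = (("224-line game -> 480-line PAL output", 224, 480),
             ("240-line game -> 448-line NTSC output", 240, 448))

    for label, src, dst in cases:
        factor = dst / src
        kind = "clean" if factor.is_integer() else "non-integer, uneven line duplication"
        print(f"{label}: x{factor:.3f} ({kind})")
    # 480/224 = x2.143 and 448/240 = x1.867 -- neither is a clean x2.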



Have you tried Mushi or Ibara? They're supposedly 320 x 240 originally. I know they're all filtered, but many people are de-interlacing them with soft patches and whatnot and they seem to work to some degree.

I tried Mushi. It's badly scaled on the Y axis (240 to 448), as you know, but also on the X axis!!! When the backgrounds scroll, there are several places where the lines are suddenly divided...
It's small, but it's annoying. And of course the game is filtered.

Here is a demonstration: take your genuine 320x240 picture [>], and apply a brainless scaling that keeps the aspect ratio when it's not recommended [>]: 597x448, instead of using a clean doubling on the X axis (320x2=640). If the game had artefacts only on the Y axis, it would have been better. But doing it the same way as Dai-Ou-Jou would have been so much better!!! -_-

I didn't try the patch for recovering the 240p mode, but I know the picture won't be much better. A filtered picture is an altered picture, a bad upscale is a bad upscale, and nothing can be done about it. You need more than a patch (which sets the PS2 output to PS one mode), you need to modify the game's code... Well, it's a bit tricky! :P

Same shit for Ibara (some would say it's even worse, but I can't imagine how it could be worse...).




I'm thinking indeed about many possibilities. Windows games at 320 x 240 design resolution but with forced 640 x 480 mode in full screen, for instance. There're lots of doujin games like that (some do keep the desktop rez, fortunately for us with 15 kHz cards, but very few).

If the game runs at 640x480 with clean line doubling, there's no need to force it to 320x240. Just run the graphics card at 480i, and disable the interlacing on the TV.

If you use Windows XP, when you force a game to a lower resolution, it may be filtered, as everything is filtered by now (on Windows 2000, the default image viewer didn't filter pictures when you viewed them at a different resolution, for example).

And what about Capcom's DC games like Capcom vs SNK 2 or Marvel vs Capcom 2? You must have tested that!

If this somehow works (can't you harm the TV?) you've made my day.

It works, but you get heavy flicker on bright horizontal lines, and the picture isn't as sharp as a true 240p output...

There is no danger for the TV; the deinterlacing process doesn't modify the frequency, timings or voltage amplitude of the signal. It's just a question of skipping some sync lines and the half-scanline of the regular interlaced signal.

Every old game system that runs at 240p uses a non-standard signal, a video signal with (deliberately) missing sync information. That's the secret. :)

17

(86 replies, posted in English talk)

Stretching 448 lines out of 640 for a 4 : 3 display sounds like a bit too much, though, even for the best tubes out there.

Remember that manufacturers use the same chassis for several sizes (20" to 25", and 25" to 29", for example). You can take a 25" chassis and put it on a 29" tube. On analog chassis, you can change the potentiometers according to your needs, but on digital chassis, you need to change the range of the available values (e.g. a 25" chassis is capped at 3F in its parameters, while the same chassis programmed for a 29" can go up to FF).
Some manufacturers use the same chassis for 16:9 and 4:3 tubes; you just have to specify it in the service mode (and get a wider range for the scanning).

Increasing the picture size is limited by the electronics, but you can get a good enough result to display the picture with a black border. This is not pure full screen, but it's way better than crappy filtered shit badly scaled.
From what I've read on boards, even players who use flat screens in tate don't use a full stretch for the picture: they look for the best percentage to minimise the artefacts...






I'm interested in the interlace disabling without scan converter you mention though. Which TVs have it or how do you find it on the service menus?

First, you can only find it on digital chassis (TVs from 1995 and later). On some TVs, it is explicitly indicated in the menus:

http://raster.effect.free.fr/15khz/480i … 8366_2.jpg

( other pictures at : http://raster.effect.free.fr/15khz/480i-240p/  )

This example is a 1995 Trinitron.
The TDA8366 (from Philips) is used in various TVs (Sony and Philips of course, but also other brands that don't make CRTs themselves). Easy: set the "interlace" option to 0 or 1 to enable or disable interlacing, for absolutely every incoming signal. This is raw deinterlacing (no conversion): the specific sync pulses and the half scanline (32 microseconds instead of the regular 64) that are responsible for the interlacing are simply skipped.
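To make the "half scanline" point concrete, here is the vertical timing in numbers (a rough sketch using the standard 525/60 figures, not something read out of the chip):

    # 480i (525/60): 262.5 lines per field -- the odd half line is what offsets
    # the second field vertically between the lines of the first one.
    LINE_US = 63.56                       # one ~15.734 kHz scan line
    field_interlaced  = 262.5 * LINE_US   # ~16 684 us -> the two fields interleave
    field_progressive = 262.0 * LINE_US   # drop the half line -> both fields land
                                          # on the same scan lines = a 240p-style raster

    print(f"interlaced field : {field_interlaced:.0f} us (ends mid-line)")
    print(f"progressive field: {field_progressive:.0f} us (whole lines only)")
    # The service-menu option simply stops honouring that half-line offset, so each
    # field overwrites the previous one instead of being drawn between its lines.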

Other TVs let you change the setting of the crystal oscillator. Doing this can disable the interlacing. I saw that on some TVs (from 2000 to 2002, Thomson chassis). The problem is that the interlacing comes back when you exit the service mode and reboot the TV... (maybe the deinterlacing was a bug and not actually intended?).



A few months ago, I went to a friend's place to install a Wei-Ya multi-sync chassis in his Windy II (Konami cab'). We tried SFIV (PS3) set to 576i (yeah... PAL). The chassis didn't do the interlacing, showing the game at 288p. It looked great; the only problem was the heavy flicker of the gauges. Any straight, bright horizontal line tends to flicker a lot.

I tried to make a simple electronic circuit to skip the special lines of the vertical sync, but I had problems with image stability, and "fading" (the image got darker). Now, since I use Soft15kHz and generate my own modelines, I know where the problem came from (front and back porch).
At the time I didn't go further, because I noticed that the "deinterlaced" picture looks awful if the source was filtered (either bilinear or anti-flicker)...
If the game or the system doesn't allow you to display a clean picture, it's not worth trying to display 240p. If you try to regain scanlines from a game with 240 lines that has been scaled to 448 lines (the typical PS2 case) and filtered, you'll get nasty flickering on a few lines, and a shitty picture anyway.

But when you start from a nice source (an emulator set to 480i with perfect line doubling and no filters, displayed on your TV with Soft15kHz), it gives you perfect results! (and absolutely no lag)




[EDIT:] Display these 2 pictures in your browser to see the magical deinterlacing in action ^^ :

http://raster.effect.free.fr/15khz/480i … z_480i.jpg
http://raster.effect.free.fr/15khz/480i … z_240p.jpg

18

(86 replies, posted in English talk)

By the way, another example of dumb scaling:

The Ketsui port on Xbox 360.

Well, how do you display the PGM resolution (448x224) in a 640x480 window, for those who want to set their system to 480i and plug it into a 15 kHz 4:3 CRT?
As usual, let's do some irregular stretching and blur it!!
Considering that this resolution (640x480) is meant for 4:3 CRTs, and not for 16:9 HD LCDs with a fixed resolution, they could have done it way better: they should have doubled the height of the picture (224x2=448) with the nearest-neighbor option (no filtering) and done nothing to the horizontal size. So, you display a 448x448 picture in a 640x480 window. You then just have to change the horizontal size in the chassis of your CRT: a perfect stretch with no loss of quality (thanks to the flexibility of the raster).
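Here's what that clean path would look like in code (a little sketch using Pillow, just to illustrate the idea; "ketsui_448x224.png" is a made-up file name):

    from PIL import Image

    # Take the native 448x224 PGM frame, double its height with nearest neighbour
    # (no filtering) and pad it into a 640x480 frame with black borders.
    src = Image.open("ketsui_448x224.png")            # hypothetical native capture
    doubled = src.resize((448, 448), Image.NEAREST)   # 224 x 2 = 448, X untouched

    frame = Image.new("RGB", (640, 480))              # black 480i-friendly canvas
    frame.paste(doubled, ((640 - 448) // 2, (480 - 448) // 2))
    frame.save("ketsui_640x480.png")
    # The leftover horizontal stretch is then done by the CRT's H-size control,
    # which costs nothing in picture quality.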

There are several ways to disable the interlacing of the picture (because the 360 can't do it; 480i is the lowest it can do): some chassis allow it (in arcade monitors and consumer TVs), and some scan converters do it too.

Of course, this way of displaying Ketsui isn't available, and never will be (it could be done via some software patch, but nobody [but me :P ] will ask for it, and Cave won't do it even if there were enough people asking for it...).

Even if you can disable the interlacing of the game, you will still always get a shitty picture (stretched and filtered), with scrolling artefacts...

Aw shit...

19

(86 replies, posted in English talk)

Recap wrote:

So you have already confirmed that? The shit's getting dumber and dumber. Do you know what's the default setting for the arcade version assuming it's on a Vewlix cab? Stretch to 768 lines for a full-screen display or native 720 for two black borders?

The Taito Type X² has no HDMI port. It uses VGA and DVI ports. 720p is not an XGA-family standard, nor suited to DVI (only dual-link DVI supports one of the HD resolutions, 1080p; 720p isn't available, everything on DVI comes from the XGA standard). The Type X² never sends a 720p signal, and nobody who actually played the game in arcades noticed black borders at the top and bottom of the screen (and I didn't see any pictures of the cab showing the game with black borders).
So KOF XII (and XIII), while being developed at 1280x720, is NEVER seen at its native resolution. In arcades, the game is stretched internally to 768p (1280x768), then stretched by the screen to 1366x768. Yeah...

That means the only way to display an "HD crystal clear" picture of the game is to use a home version (which can send a true 720p signal) and a true 1280x720 screen. Well, let's say that nobody has ever seen the game at its maximum quality (i.e. a non-blurred picture), and nobody ever will.

You might say it's crazy, and very dumb. And it is.



Furthermore, notice that lots of the screenshots you see on websites are fuckin' upscaled!!!

When you go to some big gaming sites, they show you some little screenshots of the game and offer a "full size" option, clearly implying it is native resolution. Wrong! In many cases, it's an upscaled version of an "editor screenshot" (because they chose to send downscaled screenshots instead of something matching the actual resolution of the game).


Look at this:

BlazBlue screenshots (home version)...
http://uk.media.xbox360.ign.com/media/1 … mgs_1.html


One of the "full size" screenshots:
http://uk.xbox360.ign.com/dor/objects/1 … =mediaFull

Even if the picture size is "1280x720", it's upscaled !!! O.o


Here's a true native res' screenshot for comparison:

http://jeux-video.portail.free.fr/previ … s3-045.jpg

Nobody ever notices it!

Look at all the BlazBlue screenshots on the IGN.com page; they are almost all upscaled. They were 1024x756 or 1024x614 screenshots to begin with.

You've seen the early "Hard Corps: Uprising" screenshots at 720p?... Upscaled!!! :P :P


It's normal that almost nobody notices it, because even if people use "crystal clear" displays, they are used to blurry pictures: their screens are almost never fed at their native res', and anyway, as soon as something moves on the screen, it gets blurred (because LCD cells are too slow).






I guess I can't have everything. The bigger the rez, the more accurate the screenshots get, no wonder, but it serves nothing if you need to downscale them in order to display them on your monitor. For game analysis, you need to show full screenshots, think about it. And I'm not dissatisfied with the results I got, as I told you. Once I get them less blurry, that is.

I tried to take the best and sharpest pictures, and tried several ways to resize them, but in the end, it's the same problem: there aren't enough pixels to show enough detail.

To take sharper pictures of a CRT screen, remember it's a question of shutter speed and underexposure. Try to find the right settings to get low levels, just bright enough to be able to recover them afterwards.

20

(86 replies, posted in English talk)

Recap wrote:

I hear you on the KOF XII subject. I'm still to see how the game works with the system set at 480p and a 31-kHz monitor, though it can't do the background graphics any good, so I'm not holding my breath. What a waste...

What a PURE WASTE! The most important things, the ones you have your eyes on every second, are the sprites. You can't take the time to notice every little detail in the background, while still being perfectly aware that the background has lots of detail (your eyes can't analyse every detail individually, but they can perceive the fact that there is a lot of information globally).

And more importantly, you can play the game that way since the system (either, the 3-60 or the Type X 2) does support 480p. Your samples are great and necessary for showing everybody that KOF XII's graphics are ruined by lame design decisions and a full-of-shit technology

Yeah, they support 480p, but not at the maximum quality (i.e. an unfiltered display).

The graphics of KOF XII are ruined first because of the lack of 2D technology. For many years now, no hardware has been able to display sprites anymore; everything is "flat 3D". That means a lack of power and memory where 2D graphics need it.

I'm pretty sure SNK-P really did begin the sprites at 720p (like BlazBlue). But they quickly realised it wasn't possible to achieve the quality they wanted (lots of shades and frames). So they reduced the size (to half), and relied on filtered shitty tricks.

When I saw Ryo's face for the first time (I expect he was one of the early fighters they started with), with a "missing line" for his mouth (as can occur with 50% nearest-neighbor resizing), I immediately thought of it.
If the sprite had been genuinely designed at 360p, the artist would not have skipped this particular line.
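That "missing line" is exactly what a blind 50% nearest-neighbor reduction does (a toy sketch; the "mouth" row is of course made up):

    # A blind 50% nearest-neighbour reduction keeps only every other row, so any
    # detail that is one row tall and sits on a dropped row simply disappears.
    art_720p = [
        "........",
        ".x....x.",   # rows 1-2: eyes, two rows tall -> at least one copy survives
        ".x....x.",
        "..mmmm..",   # row 3: a one-row-tall mouth line -> dropped entirely
        "........",
        "........",
    ]

    art_360p = art_720p[::2]      # keep rows 0, 2, 4 -- the "mouth" is gone
    print("\n".join(art_360p))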

The KOF sprites are great, even if they are only "360p". That wouldn't have been a problem if there were large multi-sync 16:9 CRTs in arcades (and at the consumer level). But instead, we jumped from 4:3 CRTs to HD flat screens.

If KOF XII had come in a nice 16:9 cab' at 24 kHz, everybody would have noticed the gorgeous graphics, with lots of vibrant colors, great shadow and lighting effects, smooth animation... Resolution isn't everything in a quality picture. But for many years, people have been brainwashed into believing that resolution is the most important thing. They so badly want flat screens with a high number of pixels, but those shitty screens LOSE their sharpness as soon as something moves (because the cells are too slow to react fast enough to follow the animation). They want screens with a "crystal clear picture", but screens that are 99% of the time fed non-native signals (blurry picture). Well, I already talked about that...


and hopefully you'll create with them some sort of a virtual museum in order to let the internet see and take notes, but... just that.

I expect more than that... ^^'
I expect hacked dashboards that allow unfiltered downscaling, and other resolutions (853x480 for example).

Man, on this fucking generation of hardware, almost nothing is done at the native resolutions of HD standards. And when it's done, the graphics are less sophisticated than those done at "sub-HD internal res".


I do really wonder if there's not a true 768-lines mode on this game. Blaz Blue has it, and Type X 2's most common mode is WXGA, given the arcade monitor standards.

BlazBlue is 1280x768. WXGA is 1366x768. Even on the original hardware in a brand new cab, it is fuckin' upscaled and filtered (at least only in the horizontal direction)... But anyway, BlazBlue sprites are always displayed at 1:1 pixel mapping, whether the height of the screen is 720 (for 1280x720 panels) or 768. The backgrounds are in 3D, so they're easy to resize.

KOF is "pure" 2D. If you could set the game at 768p, that would mean that the backgrounds are cropped when you see them at 720p. But no, there do not exist extra pixels beyond 720p. The window of the graphics is 1280x720, what ever screen will display it. Since there are barely flat screen at this exact resolution, you will always have your "crystal clear" picture (:P) upscaled and filtred, in association with the internal blurring of the sprites... Aw shit... where is my hammer ? ]:) 


I'm sure there'll be an option to remove the filter (at least for home versions), but in the end, same issue as in XII -- you can't play the game and please your eyes at the same time.

I don't think so. For KOF XII, since the sprites have an exact 200% upscaling (round number, constant pixel size), they considered the option of disabling the filters, but for KOF XIII, if you display the sprites without filtering them, they will look fuckin' ugly with enormous jaggies. They will look uglier than the unfiltered ones in KOF XII.

The only correct way to display the game without filtered or jaggy sprites is to set the resolution to 853x480 (and have a display that can resolve this resolution without artifacts... some old plasmas, and of course CRTs).
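Just to make the arithmetic visible (a quick check, nothing more): the sprites sit at 150% of their native size inside the 1280x720 frame, so dividing the whole frame by 1.5 brings them back to 1:1.

# Where the 853x480 figure comes from (sprites drawn at 150% inside a 720p frame).
frame_w, frame_h = 1280, 720
upscale = 1.5
print(frame_w / upscale, frame_h / upscale)   # 853.33..., 480.0 -> hence 853x480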



The truth is that I love the slight 'melting' with bright red lines and whatnot. I'm so used to it that I like to think of it as a particularity of CRT displays (and indeed I miss it in your photographs). But I can understand that, while it may help the overall visual enjoyment depending on your tastes, it goes against pure 'pixel analysis', yeah.

I understand that, but once the ultimate goal is reached (perfect CRT simulation), you can have a shitload of options to mess up the screen (adding misconvergence errors, setting the screen out of focus, overdriving the red gun, etc.). But FIRST, it's important to achieve the perfect thing, the top, so that after that, you can easily "go down".

If you consider a bad set-up as the main thing, it won't be possible to go beyond that... And that's why most people interested in this difficult subject are all wrong, and can hardly propose something decent...
(well, by now, some dudes might have reached a good point, but they still lack the essential knowledge of CRTs, and are still far from perfection).



Looking forward to it, though you need too big resolutions for that and you know web design requirements... Nevertheless, your ultimate goal is solving to some degree the emulation-related issue, isn't it?

The goal is to solve the emulation matter, and offer a scaling algorithm that allows developers to keep creating pixel art in the fixed-resolution HD era.
Beyond that, it could even help to play old video games on modern screens (because CRTs won't last forever...) with top-quality visuals and almost nonexistent lag (about 1/60 of a second, something barely noticeable, and way better than any scan converter available now).


About web design requirements, I'm working on it, but even the large resolution you propose (about 1280x960) is barely enough to show something accurate. Even the best picture I take from a real CRT can't show enough detail when reduced to around 960p, one way or the other...

The best I can reach for now is this:

http://raster.effect.free.fr/tv/Photos_ … na_02f.jpg


Note that this picture is taken from a low-curvature Trinitron (no cheating, it's a real old 29" low-res tube, not a computer one! :P ), shot from 3 m away from the screen. The curvature seems very low at this distance, and the focus of the picture is good even in the corners.

Each time I try to go below that, the picture shows lots of artifacts ([>] like this...).
I can reduce that with a little bit of horizontal blur, but either way, the picture will always lose the smallest details of the surface of the CRT screen (i.e. the phosphor layout).

Damn, do you realise it ? All those fuckin' "High Resolution crystal clear" screens of today can't show the magnificence of a low-res picture tube all at once... :P :P :P

The best way is to provide full shots with a necessary lack of phosphor layout (something that seems too "clean" to you), and propose shots of small parts of the action. Something that can show the greatness of pixel art displayed by a CRT, like [>] this or [>] this, maybe...

21

(86 replies, posted in English talk)

Recap wrote:

I've learned that the 640 x 400 mode of these is not 31 kHz, but 24 kHz. That technically is not 'high resolution', but 'extended resolution', as some put it.

400 lines need a little more than 24 kHz. At that frequency, you can only display 384 lines. It must be 25 kHz (400 active lines + sync lines). But yeah, it's a minor detail. ^^'

In CRT technology, the frequency is the most important indication of resolution because it determines how many lines you can display in a fixed area. The higher the frequency, the smaller the space between lines, so the better the picture looks. Remember that nobody but us, video game addicts, likes black horizontal lines on their display. ^_^
Everybody else tries everything possible to remove those lines (de-interlacing, line doubling, line tripling, etc.).

400 lines filling the full height of a 4/3 screen have more space between them than 400 lines with black borders, such as in the graphics displayed by the PC-98. Those 400 active lines have the same spacing as a genuine 640x480 picture. The hardware could have run at 31 kHz with lots of sync lines or black borders via software, and you couldn't tell the difference.
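The arithmetic behind that, as a rough sanity check (the 60 Hz refresh and the blanking allowance are just illustrative values here):

# Horizontal frequency = refresh rate x total lines (active + blanking).
refresh_hz     = 60
active_lines   = 400
blanking_lines = 17          # vertical sync + porches, roughly

print(refresh_hz * (active_lines + blanking_lines))   # 25020 Hz, i.e. ~25 kHz
# At exactly 24 kHz the budget is 24000 / 60 = 400 total lines,
# which leaves only ~383 of them active once blanking is subtracted.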


Macaw wrote:

Anyway, in the case of most PC88 stuff and some of the 98 stuff, it's incredibly bizarre that they had to design stuff in 640x200, knowing that the height will be doubled.

Recap wrote:

I hardly believe that they used square pixels-based video hardware in the early years to design game graphics. They most likely used hardware capable of the same video modes as the target platforms -- remember that work stations and PCs weren't always like they are today and that there was an era where the pixels' aspect ratio was indeed a quite changeable thing thanks to CRT technology.

It's cheaper to increase the horizontal size than the vertical one, because with a horizontal increase, you only need memory. With a vertical increase, you need memory AND higher-frequency hardware (higher bandwidth).

Back then, nobody cared about square pixels, because CRT technology is very flexible (on each scanned line, you can display whatever number of pixels you want, according to the maximum pixel clock available). Square pixels are a thing from the beginning of the HD era (back in the early 1990s), in an effort to standardize everything. Fixed-resolution panels heavily supported this movement.
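To illustrate the point with rough numbers (the blanking and refresh figures below are illustrative, not exact hardware values): widening the picture only raises the dot clock, while adding lines raises the horizontal scan rate itself.

def timings(h_total, v_total, refresh_hz):
    h_freq = refresh_hz * v_total          # horizontal scan rate (Hz)
    pclk   = h_freq * h_total              # pixel clock (Hz)
    return h_freq / 1000, pclk / 1e6       # kHz, MHz

print(timings(800,  262, 60))    # 640x200-style mode: ~15.7 kHz scan, ~12.6 MHz dot clock
print(timings(1600, 262, 60))    # twice as wide: same ~15.7 kHz scan, dot clock doubles
print(timings(800,  440, 56.4))  # twice as tall: the scan rate itself jumps to ~24.8 kHz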


Recap wrote:

Pasting here the promised PC88 Dios screenshot at its native res with 15 kHz video hardware. This is how the game looked like originally on the PC88:

http://i30.tinypic.com/2v8k9oy.jpg

Notice, despite the blurry pic (I was out of luck this time), that it has little to do with what you get with standard PC video hardware. Graphics look almost like a 16-bit console game.

C'mon boy, you need more practice (and to follow my advice) ! ^^

Look at this:

http://raster.effect.free.fr/tv/Photos_ … 200_02.jpg

( [>] smaller version )


Other stuff:

http://raster.effect.free.fr/tv/Photos_CRT_3/

Haaaaa... nothing beats a true 15 kHz display ! :D

22

(86 replies, posted in English talk)

Recap wrote:

I hear you on the KOF XII subject. I'm still to see how the game works with the system set at 480p and a 31-kHz monitor, though it can't do the background graphics any good, so I'm not holding my breath. What a waste...

Most multi-sync/31 kHz arcade monitors have the same large pitch as the usual TVs driven at 15 kHz. That means you can't obtain such precise scanlines (as in the picture shown before) at 31 kHz on these.
The electron gun is designed for a bright, large beam spot, and you have a limited number of phosphor triads. You need a 21 or 22" PC monitor with a fine electron gun and a precise grid (but a small screen size...) or an insanely expensive 32" CRT broadcast monitor that can really display 720p. You can also get good results with CRT projectors (but not as good as a direct-view CRT).

I don't own an Xbox 360, but I'm pretty sure the downscaling to 480p is filtered. Anyway, even if it's not as precise as the picture posted above, it will always be more pleasant to the eye than rough cubic pixels at a different resolution from the background, and of course blurry filtered sprites.

This picture [>] is based on a nice .png screenshot of KOF XII (720p), reduced to 360p with the nearest neighbor option. Perfect pixels for the sprites and the background, displayed in heavenly conditions on a high-performance CRT. Each time I look at this, and then go see the game on a 768p or 1080p flat screen (because there are very few true 720p flat panels out there...), I'm so angry. I just want to take a hammer and smash the screen. :P
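If you want to reproduce that kind of reduction yourself, here is a minimal sketch with Pillow (the file name is just an example, any lossless 720p capture will do):

from PIL import Image

shot = Image.open("kof12_720p.png")                       # 1280x720 source
half = shot.resize((shot.width // 2, shot.height // 2),   # exact 50% reduction
                   Image.NEAREST)                         # no filtering: each 2x2 block -> 1 pixel
half.save("kof12_360p.png")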

Considering that KOF XIII only runs with filtered sprites upscaled to 150%, there's definitely no chance of trying any tricks to display a nice picture on the best CRT displays.

Just for the tears:

http://raster.effect.free.fr/tv/photos_ … s_comp.jpg

> Part of a 720p KOF XIII screenshot on the same Trinitron monitor, driven at 31 kHz. On the left, a native-res sprite of Iori, and on the right, the sprite with 150% filtered upscaling.
:-[


Even if you took the signal and converted it from 720p to 852x480 (a 66% downscaling in order to recover the original size of the sprites and have the same definition for background and sprites), it would be impossible to obtain the same precision and gorgeous visuals as the KOF XII 360p picture.


Man, I hate fixed-res panels, and I hate the "everything scaled and filtered" shit.

What's the point of having easily addressable pixels (the only advantage compared to CRT) if most of the time you get a blurry picture and artifacts ? Flat panels are just good for internet browsing and word processing. For everything else, you need a true multi-sync monitor. And CRT is the only technology that can give you this.



We have here a 'true low-res through your PC' thread [ > ] where we go a step further than most people and also explain how to get the card to run at the proper refresh rate for every particular game. If you happen to have an Arcade VGA / old ATI card and haven't gone through the issue yet, you have there an interesting read.

Yeah, I suspected you would already know the software ^^, or have tried an Arcade VGA or something else.
I knew there were solutions to drive a graphics card at 15 kHz, but for many years I didn't want to hear anything about PC stuff and emulators. I was fed up with it, and went back to original hardware: nice CRTs, real consoles and arcade PCBs.

The reason I finally tried soft15khz a few days ago was to create the best patterns for convergence and geometry adjustment.

You can check this:

http://raster.effect.free.fr/15khz/CRT_set-up/mires_240p/mires_240p_conv_cross_R+B+M_v6_centre.png

(others: http://raster.effect.free.fr/15khz/CRT_set-up/ )

Very useful, and much more efficient than anything you can find in a PCB service menu, or any regular convergence pattern available.
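If anybody wants to generate patterns like that, here is a simplified crosshatch sketch with Pillow (the grid spacing and colours are my own guesses, not the values from the files linked above):

from PIL import Image, ImageDraw

W, H, STEP = 320, 240, 16
img = Image.new("RGB", (W, H), "black")
draw = ImageDraw.Draw(img)
for x in range(0, W, STEP):                        # vertical lines of the crosshatch
    draw.line([(x, 0), (x, H - 1)], fill="white")
for y in range(0, H, STEP):                        # horizontal lines
    draw.line([(0, y), (W - 1, y)], fill="white")
img.save("crosshatch_240p.png")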



I think it's people like you the ones who are going to enjoy the next Postback update the most. I talked with Ronan about asking you how to properly photograph a CRT, since it seems it's a matter you have lots of experience with. I've learned some bits since that Willow screen I linked in a previous post here and I'm more satisfied with my results every day, but maybe you still could show me some tips for the camera. My Trinitron is 'high curvature', though, making the screenshots less... 'consistent' around the corners. If you agree, we could follow the conversation in the development subforum.

Without modesty, I can say that I'm an expert at photographing CRTs. :P
With my advice, you will achieve quality such as this:

http://raster.effect.free.fr/tv/photos_ … G-SvsC.jpg
(it's reduced to 50%, the original picture is much more precise, but very big, about 5 MB)

A near-perfect CRT set-up and the proper way to capture it with your camera.

This is the best you can obtain from a curved shadow-mask picture tube at 15 kHz. The beam spot is precise at every greyscale, even on bright red.
And about the matter of the red beam:

Keep in mind that true scanlines aren't perfectly regular black lines; they vary according to the colors they separate, being even virtually invisible with some colors like red if the screen is not too big

The red beam is always driven stronger than the green and blue ones (because the red phosphor is less efficient and needs more current), but with a proper set-up of every parameter, you can obtain precision at all levels.
So, the usual blur and merging that occur with bright red portions are not a characteristic of the CRT picture, it's just a question of the quality of the adjustment. Hence, we see that careful observation is not enough, you need to know how to set up a CRT to determine what to do when recreating a CRT picture in a digital environment.


Most TVs and arcade monitors are not set up to their maximum potential (especially for convergence), because you need time to do it, and because 15 kHz screens are primarily designed for a bright picture (for use in bright environments) and interlaced signals, with a necessary blending of the lines. Precision is not the main goal for a 15 kHz interlaced picture tube. But we, pixel lovers and users of non-interlaced signals, want it, because it's possible !!! :P

Lots of people think that each field of the interlaced signal ("half picture", at half the resolution) is supposed to be displayed with a large space between the lines in order to fit the lines of the next field. That leads them to think that "scanlines" are plain black lines the same size as the lines displayed. Hey no ! First, scanlines are the lines that are displayed (that are... scanned), and the black space between them is not regular, and not as large. But for convenience's sake, people keep calling "scanlines" the black space they see in a 15 kHz picture, and they keep viewing it as regular lines. It's difficult for them to understand why the beam varies in size. But this is nothing less than a fundamental part of CRT technology, and the most important thing for achieving a good CRT simulation. But when you see some atrocity like this:

http://www.bogost.com/games/a_televisio … ator.shtml

Then, you know that people are very far from understanding the point...


I've been working on a way to easily recreate the variation of the beam:

http://raster.effect.free.fr/15khz/samples/x4_L_std.png
(example of a bright pixel)


http://raster.effect.free.fr/15khz/samples/x4_D_std.png
(example of a dark pixel)


Even if in reality the spot would have different sizes, for a digital environment I chose to use the same size and create the illusion with shades of colors.
A pixel that would have been a 4x4 square in a classic 400% nearest neighbor scaling becomes a 6x4 sample (the core is 4x4, and you have two extra columns depending on whether the beam size increases or decreases).


The goal is to "interpret" the picture in only one pass (to have minimum use of resources and minimum lag), directly taking each original pixel and "constructing" the final result according to a table that stores small sets of values (1A to 4F in the examples above) for every variation of greyscale. There would be only one comparison per pixel (against the previous one on the line), to determine whether the beam increases or decreases. Well, it's a simulated scanning (from top to bottom and left to right) to obtain the data, but the result is a full frame (no layers, no transparencies), constructed for a fixed resolution.
It may not be very clear, because it's not easy to explain even in French for me... ^^
But I think this is the best way. The difficult part is to create the table, you need a perfect screenshot to start from. For this, I will use my method, because it's very tweakable and precise. I need to adjust every layer according to observation of the behavior of the real beam in the best conditions: the proper set-up of the screen. For this last point, I'm ready now, I know. :-P

I need to explain the trick and convince somebody to write a bunch of code lines, somebody who wants to recreate a perfect CRT picture, not somebody who thinks that CRT screens produce genuinely dirty pictures with lots of defects that you need to focus on. In fact, that's the tricky part ! ^_^
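To give an idea of what I mean, here is a minimal one-pass sketch in Python (greyscale only; the two transition columns use a simple placeholder ramp instead of my measured table, and the vertical shape of the beam and the scanline gaps are not modelled at all):

import numpy as np

SCALE = 4  # the 4x4 core of each pixel, as in a 400% nearest neighbor scale

def beam_pass(src):
    # src: 2-D uint8 array (one channel). Returns the upscaled frame.
    h, w = src.shape
    out = np.zeros((h * SCALE, w * SCALE), dtype=np.uint8)
    for y in range(h):
        prev = 0
        for x in range(w):
            v = int(src[y, x])
            ys, xs = y * SCALE, x * SCALE
            out[ys:ys + SCALE, xs:xs + SCALE] = v              # the 4x4 core
            if x > 0:
                # Two transition columns leaning into the previous pixel:
                # a stand-in for the per-greyscale lookup table (1A to 4F).
                t1 = (2 * prev + v) // 3
                t2 = (prev + 2 * v) // 3
                out[ys:ys + SCALE, xs - 2] = np.maximum(out[ys:ys + SCALE, xs - 2], t1)
                out[ys:ys + SCALE, xs - 1] = np.maximum(out[ys:ys + SCALE, xs - 1], t2)
            prev = v
    return out

# A bright pixel next to a darker previous one widens into it, while a dark pixel
# after a bright one stays narrow -- the "beam size" behaviour described above.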

So, wait and see...

23

(86 replies, posted in English talk)

Hello low-res lovers !

First, I'd like to congratulate Recap on this excellent website.
A few years ago, the very first thing that caught my attention was the care given to the screenshots, a huge difference from every other site with the usual jaggy or blurry pictures. Then I read the reviews and other stuff, and I definitely knew I was in the right place.

I've been working on the subject for a long time, looking for the perfect scanline (right after "the perfect beat" :-P ). I've been working on many different CRTs, taking tons of pictures, learning the complex way to properly set up those screens. The goal was to be sure of what is possible and/or desirable to recreate from the raw output of an emulator. I'm still working on a huge tutorial about the set-up of CRTs, especially the difficult part: convergence.

Many people in the emulation scene want to recreate the feel of the CRT, but they have in mind that the picture tube gives genuinely dirty visuals, with lots of defects. They focus on the bad points (not on the right behavior of the beam under the best conditions). I can't stand this, because I'm an ardent supporter of CRT precisely because this technology gives you the best picture at any resolution, if you feed it a good signal. For example, take a look at those beautiful pictures of KOF XII sprites at their native resolution (640x360) on a Trinitron PC monitor:

http://raster.effect.free.fr/31khz/KOF_ … 66x768.jpg

http://raster.effect.free.fr/31khz/KOF_ … 360p_s.jpg

http://raster.effect.free.fr/31khz/KOF_ … 60p_2s.jpg

(others: http://raster.effect.free.fr/31khz/KOF_sprites_360p/ )

If only you could see it in front of the monitor, it looks gorgeous. Very "15 kHz-like", very "video game graphics". The deep black and the perfect colors of the tube give you an unmatched result. I can't get enough of the scanline effect. ^_^
And on this subject, I must say that Ronan has achieved the best result in recreating those beloved scanlines. His method is even better than mine, because it shows the behaviour of the beam at any greyscale. My method is more precise and "tweakable", but if I want to handle all greyscales, it becomes very laborious. His method is really simple compared to mine. So, nice job, Ronan. "You're doing great !" :-)

For now, I'm working with the wonderful "soft15kHz" software, which allows you to drive your graphics card at 15 kHz, with every possible resolution at this frequency, progressive and interlaced. Fuckin' great, I'm like a little boy in a big toy store. :-P
Very helpful for working with non-square pixels (like 448x224 or 512x240).
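By the way, a quick way to sanity-check a candidate timing before trying it on the screen (the clock and blanking numbers below are just an illustration, not values taken from soft15khz):

# Candidate timing for a 448x224 progressive mode.
pclk_hz           = 8.6e6         # pixel clock
h_active, h_total = 448, 546      # pixels per line (visible / with blanking)
v_active, v_total = 224, 262      # lines per frame (visible / with blanking)

h_freq    = pclk_hz / h_total     # ~15.75 kHz -> inside the 15 kHz range
v_refresh = h_freq / v_total      # ~60.1 Hz
print(round(h_freq), round(v_refresh, 1))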


Before I log out, just for the pleasure, a full shot of a 15 kHz picture (Trinitron, low curvature):

http://raster.effect.free.fr/tv/photos_ … -Leo_s.jpg


See ya'