
Upscalers, CRTs, PVMs & RGB: Retro gaming done right!


danielcw

Member
Agreed, hate having to deinterlace that shit when I record most PS2 games.

Why would you ever have to deinterlace?
If it is a 30fps game, both fields are from the same frame.
If it is a 60fps game (or variable fps), every field is its own frame, so treat it like 240p.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Why would you ever have to deinterlace?
If it is a 30fps game, both fields are from the same frame.
If it is a 60fps game (or variable fps), every field is its own frame, so treat it like 240p.
Older XRGBs actually do handle 480i60 without going through true deinterlacing; that's part of the reason why they're so fast compared to other upscalers. They just double each field and display them as separate frames.

The problem with this, though, is that the screen wobbles up and down quite a lot as a result. The XRGBs offset the position of every other frame a little to compensate for this as best as possible, but it's still noticeable.

There's no perfect way to handle 480i60. No matter what you do, you're always forced to account for some visual information that just isn't available.
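
If it helps to picture it, here's a minimal sketch of that field-doubling idea in Python (made-up row data as lists; a real scaler obviously works on the raw video timing, not lists, so this only shows the shape of the trick):

    # Minimal sketch of "bob" / field doubling as described above.
    # A field is just a list of scanline rows; the data here is made up.
    def bob_deinterlace(fields):
        """Turn each low-res field into a full-height frame by doubling every line."""
        frames = []
        for index, field in enumerate(fields):
            frame = []
            for row in field:
                frame.append(row)
                frame.append(row)  # line-double: each field line fills two output lines
            if index % 2 == 1:
                # Odd and even fields are vertically offset by half a line in the source,
                # so nudge every other output frame by one line to compensate (crudely).
                frame = [frame[0]] + frame[:-1]
            frames.append(frame)
        return frames

    # Two 3-line "fields" from a moving image; the second was sampled 1/60 s later.
    top_field = ["A1", "A3", "A5"]
    bottom_field = ["B2", "B4", "B6"]
    for f in bob_deinterlace([top_field, bottom_field]):
        print(f)

Even in this toy version you can see why the picture bobs: the two output frames are built from lines that sat at different heights on the original screen.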
 

BocoDragon

or, How I Learned to Stop Worrying and Realize This Assgrab is Delicious
Aside from save/load profiles, what does the new XRGB mini firmware do?
 

BocoDragon

or, How I Learned to Stop Worrying and Realize This Assgrab is Delicious
The faster switching between source resolutions was a myth, I take it.
 
The faster switching between source resolutions was a myth, I take it.
Settings? They still look borked to me.
Just threw on Saturn Nocturne, yeah still pretty slow, 4 seconds-ish.
Yeah, I'm thinking this was a myth. Think the guy on shmups was messing with sync settings and that's why.
Yeah, 4 seconds for Saturn SOTN sounds right

i'm honestly not sure, i loaded that one guy's Genesis & SNES profiles, think the SNES one is what i ran with...scanlines lookin good, status says 1080
Scanlines for 1080p work with Smart 1x or 2x but not with anything else. Turn on 1080p output, then select Smart 2x and scanlines and they should be fine. Any sort of zoom will override the Smart 2x scaling, so you can't use it.
 
Don't have a monitor.

This is the cheapest splitter I've seen so far, apart from Y-cable things that would probably be awful.

Will have to look into some S-Video capture cards tomorrow.

Take a look at these guides: https://forum.speeddemosarchive.com/post/complete_creating_sda_acceptable_videos_ezcap_guide_.html

The first one specifically has the info that you're looking for. I went through setting this stuff up just yesterday actually. I ended up getting the Dazzle DVC 100 pretty cheap on ebay, and my PVM has A/V out so I didn't have to worry about splitting the signal, but here's a link for that: https://kb.speeddemosarchive.com/Splitters

You'll have to have a PC nearby to hookup the capture card to, as well as run AmaRecTV or whatever software you end up using.

I remember seeing that XCAPTURE-1 card when I was searching on how to set this stuff up, but I couldn't justify spending around $300 when I was just getting into recording stuff.
 
I haven't had sync issues related to this, so I'm honestly not sure. It might be safer to just get the powered one, but I don't know what issues it's meant to fix.
Is there any potential problem with having a sync stripper on your cable if you don't need one? Will it just do nothing, or will it mess with the signal?
 

baphomet

Member
Is there any potential problem with having a sync stripper on your cable if you don't need one? Will it just do nothing, or will it mess with the signal?

It won't work if the cable isn't wired for 5v.

There's literally no reason to use a sync stripper with an xrgb mini. Earlier firmwares wanted csync for some consoles, but that's no longer the case.
 
Yeah I wish there was a wiki that would tell you the optimal cable selection for a given console -> FM -> display type, to sort out all this csync, sync on this or that, sync splitter, audio splitter stuff. Is such a thing even possible?
I think it would be like "NTSC Saturn>>>SCART with boosted sync>>>FM>>>HDMI"
Above is just an example, I don't know whether Saturn needs boosted sync or not. But one could probably compile a list of the 3 or so versions of each console in each region and account for which cable you need whether you're connecting to FM or BNC monitor inputs. The thing that freaks me out is when retro_console_accessories has 2 cables for the same system, I'm like ok which one is it?
 
It won't work if the cable isn't wired for 5v.

There's literally no reason to use a sync stripper with an xrgb mini. Earlier firmwares wanted csync for some consoles, but that's no longer the case.
baphomet to the rescue. So say I wanted to use a Genesis Model 1 on both my FM and on a BNC monitor, would the same cable be good for both? If so, is that the case with most consoles, that the same retro_console_accessories cable is generally good for both?
 

baphomet

Member
baphomet to the rescue. So say I wanted to use a Genesis Model 1 on both my FM and on a BNC monitor, would the same cable be good for both? If so, is that the case with most consoles, that the same retro_console_accessories cable is generally good for both?

It would just come down to whether your BNC monitor requires csync or not. I've yet to come across one that does, but I know some older ones, as well as stuff like the NEC XM line, require csync.

Just to make it easier for your Genesis 1 question: just get the cable with the boosted sync. The Genesis is the only thing I ever had an issue with on the XRGB, and it was due to the weak sync signal the system puts out. I've literally plugged every other system I can think of in there and had 0 issues.
 

D.Lo

Member
There's literally no reason to use a sync stripper with an xrgb mini. Earlier firmwares wanted csync for some consoles, but that's no longer the case.
I needed amplified csync from Sega consoles (Master System, Mark III, and Mega Drive II) to get a picture without sync dropouts. Never any issues with any other consoles however.
 
Older XRGBs actually do handle 480i60 without going through true deinterlacing; that's part of the reason why they're so fast compared to other upscalers. They just double each field and display them as separate frames.

The problem with this, though, is that the screen wobbles up and down quite a lot as a result. The XRGBs offset the position of every other frame a little to compensate for this as best as possible, but it's still noticeable.

There's no perfect way to handle 480i60. No matter what you do, you're always forced to account for some visual information that just isn't available.


Sorry if this is a stupid question, but what exactly is the "480" in "480i"? Is 480i always considered 480 height? It's just a bit confusing because some SNES games have 480i modes too. Does that mean some SNES games go from 240 vertical to 480 (640 would be horizontal, or something, I guess)?

And if interlaced modes are frowned upon, would it be better to use GSM to force 480p with component cables on PS2 games that are natively 480i? I don't see how it would be so bad if they were natively 480 vertical, unless 480i refers to 240 vertical. Does that mean most PS2 games are natively 240 vertical? Sorry again if it's a really dumb question, but I am legit curious I guess.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
There are 3 simple GIFs in the OP that try to demonstrate the difference between 240p, 480i, and 480p.

[Animated comparisons from the OP: 240p.gif, 480i.gif, 480p.gif]


480i is twice the resolution of 240p, but only half of each frame is actually drawn to the screen. In still shots, this effectively gives you double resolution, but in motion, results can be mixed.

As D.Lo said earlier, 480i isn't terrible on CRTs, because CRTs are actually capable of correctly updating each half of the frame as it comes in. Flatscreens don't do this. They have built-in deinterlacers that try to interpolate the missing half of each frame, and for gaming, that usually produces ugly results. Furthermore, flatscreens tend to not be able to tell the difference between 480i and 240p in the first place, so they'll still try to deinterlace a 240p picture with the same methods as 480i, which is just nasty.

These are the main reasons why dedicated upscalers like the XRGBs are beneficial in the first place. They do a way better job at handling 240p/480i signals.
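
To put the "half of each frame" point another way, here's a toy weave sketch in Python (row labels are invented): recombining two fields from a still picture gives you the full-resolution frame, but if anything moved between the two fields, alternating lines disagree about where it is, which is the combing that flatscreen deinterlacers then try to paper over.

    # Toy sketch of "weave": interleave the odd-line field and the even-line field
    # back into one full-height frame. Row labels below are made up.
    def weave(top_field, bottom_field):
        frame = []
        for odd_row, even_row in zip(top_field, bottom_field):
            frame.append(odd_row)   # lines 1, 3, 5, ... from the first field
            frame.append(even_row)  # lines 2, 4, 6, ... from the second field
        return frame

    # Still image: both fields show the same picture -> clean full-resolution frame.
    still = weave(["sprite@x=0 (line 1)", "sprite@x=0 (line 3)"],
                  ["sprite@x=0 (line 2)", "sprite@x=0 (line 4)"])

    # Motion: the second field was sampled 1/60 s later, after the sprite moved ->
    # alternating lines disagree about its position (combing).
    moving = weave(["sprite@x=0 (line 1)", "sprite@x=0 (line 3)"],
                   ["sprite@x=8 (line 2)", "sprite@x=8 (line 4)"])

    print(still)
    print(moving)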
 

D.Lo

Member
Sorry if this is a stupid question, but what exactly is the "480" in "480i"? Is 480i always considered 480 height? It's just a bit confusing because some SNES games have 480i modes too. Does that mean some SNES games go from 240 vertical to 480 (640 would be horizontal, or something, I guess)?

And if interlaced modes are frowned upon, would it be better to use GSM to force 480p with component cables on PS2 games that are natively 480i? I don't see how it would be so bad if they were natively 480 vertical, unless 480i refers to 240 vertical. Does that mean most PS2 games are natively 240 vertical? Sorry again if it's a really dumb question, but I am legit curious I guess.
The numbers refer to the vertical resolution of the output.

P and I refer to whether the output updates every line on every frame (progressive), or alternates between the odd and even lines on each pass (interlaced).

480i is the standard NTSC SD resolution of every TV released for like 70 years (ignoring PAL for now). They went with interlacing because it created the best motion for the least bandwidth for video content.

240p was a hack invented by game console developers to get a stable image onto screens while using less bandwidth. It sends the TV (which is expecting 480i) only half of the lines, and the other half remain blank. This creates the black lines referred to as 'scanlines' (though in reality they are 'unscanned lines'). It's the standard output of every console from the Ataris until the N64. Some games on later consoles still used it, the last console supporting it being the Wii.

For the PS2/GC generation, 240p wasn't enough anymore. But most TVs didn't support 480p (get to that in a moment), so their standard output was 480i, just like a TV station's signal.

480p was a fairly late inclusion in SDTVs, and relatively rare. By updating every line of a 640x480 image, you get a more stable picture with increased temporal resolution. The static resolution is identical to 480i, so for a screenshot of a static screen they will look identical.

480p, while a nice bonus on a CRT, becomes very important when moving to fixed-pixel displays (LCD, Plasma), because these displays are progressive by nature, and don't 'scan' lines across the screen like a CRT does. Hence they have to convert an interlaced picture to a progressive one to display it. And these converters in the screens are usually optimised for video, and very bad for sharp gaming pictures, so you get a smeared, crawling picture.

Edit: Also, what Six said!
 
Thanks a lot for all the information. Sorry by the way I did not see the OP before.

I don't know how to explain this really and I guess it's only applicable to LCDs but when you guys say "half frame" during interlaced modes, does that mean it would run at 30 fps, but rendered at 60? Or half the fps rendered at its true fps?
 
Thanks a lot for all the information. Sorry by the way I did not see the OP before.

I don't know how to explain this really and I guess it's only applicable to LCDs but when you guys say "half frame" during interlaced modes, does that mean it would run at 30 fps, but rendered at 60? Or half the fps rendered at its true fps?

Literally half the image is rendered per frame, like the examples 645 posted.
 

D.Lo

Member
Thanks a lot for all the information. Sorry by the way I did not see the OP before.

I don't know how to explain this really and I guess it's only applicable to LCDs but when you guys say "half frame" during interlaced modes, does that mean it would run at 30 fps, but rendered at 60? Or half the fps rendered at its true fps?
The temporal resolution is halved, so yes, each pixel is only truly updated 30 times per second, on an alternating schedule.

But because half the screen is updated without the other half being erased, you do get 60 unique images on the screen per second.

On a CRT, the 'illusion' of a full screen 60fps is maintained quite well with the alternating lines, because on a CRT the whole screen is constantly moving and constantly updated anyway - each scan starts at the top left edge, scans across that single line, and then moves onto the next line (either line 2,3,4...480 etc if progressive or line 3, 5, 7...479,2,4,6...480 if interlaced).

The illusion breaks on an LCD because the tech works in a completely different way. They update every changed pixel at once, and also don't update pixels if they do not change. So when they receive an analogue image, they have to 'pretend' to be a CRT, emulating the way it creates a stable picture with that kind of image source.

Ultimately we're dealing with an analogue/digital problem. It's expensive and complicated for a computer to emulate the quirks of a technology invented in 1897, and a standard version of that technology created in 1941. Particularly for games which relied on specific quirks of CRTs to display the way they did.
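
For what it's worth, the scan order described above can be written out directly. A quick sketch, assuming 480 visible lines and ignoring the half-line and blanking details:

    # Sketch of the line order a CRT draws, per the description above.
    # Assumes 480 visible lines; real signals also carry blanking lines and a half-line offset.
    def scan_order(lines=480, interlaced=False):
        if not interlaced:
            return list(range(1, lines + 1))    # 1, 2, 3, ..., 480 on every frame
        odd = list(range(1, lines + 1, 2))      # first field: 1, 3, 5, ..., 479
        even = list(range(2, lines + 1, 2))     # second field: 2, 4, 6, ..., 480
        return odd + even                       # two fields together cover the full frame

    print(scan_order(8, interlaced=False))  # [1, 2, 3, 4, 5, 6, 7, 8]
    print(scan_order(8, interlaced=True))   # [1, 3, 5, 7, 2, 4, 6, 8]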
 
240p was a hack invented by game console developers to get a stable image onto screens while using less bandwidth. It sends the TV (which is expecting 480i) only half of the lines, and the other half remain blank. This creates the black lines referred to as 'scanlines' (though in reality they are 'unscanned lines'). It's the standard output of every console from the Ataris until the N64. Some games on later consoles still used it, the last console supporting it being the Wii.

240p uses the same bandwidth as 480i. Exact same number of lines sent over the wire, just some parts of your screen never really get lit up. It's considered a hack because it uses the exact same analog signal except it removes the pulses during vblank that tell the electron gun to shift scanlines every other field. It works so consistently on CRTs because of how tightly analog video is tied to the way CRTs operate.

Seems like this was mostly done so that they didn't need to draw as much stuff on the screen.
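
As a rough back-of-the-envelope check on the "same bandwidth" point (standard NTSC-ish numbers; the line totals include blanking lines you never see, and some games use 263 lines rather than 262):

    # Quick check that 240p and 480i push the same number of scanlines per second.
    H_SCAN_RATE = 15_734           # approximate NTSC horizontal scan rate, lines per second

    lines_480i = 525 / 2           # 262.5 lines per field; two offset fields make a full frame
    lines_240p = 262               # the hack: drop the half line so the fields stop alternating

    fields_per_sec_480i = H_SCAN_RATE / lines_480i   # ~59.94 fields per second
    frames_per_sec_240p = H_SCAN_RATE / lines_240p   # ~60.05 frames per second

    print(f"480i: {lines_480i} lines/field x {fields_per_sec_480i:.2f} fields/s")
    print(f"240p: {lines_240p} lines/frame x {frames_per_sec_240p:.2f} frames/s")

Either way the TV is being fed roughly 15,734 lines every second; 240p just never fills in the other half of them, which is what leaves the dark gaps between the scanned lines.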
 

D.Lo

Member
240p uses the same bandwidth as 480i. Exact same number of lines sent over the wire, just some parts of your screen never really get lit up. It's considered a hack because it uses the exact same analog signal except it removes the pulses during vblank that tell the electron gun to shift scanlines every other field. It works so consistently on CRTs because of how tightly analog video is tied to the way CRTs operate.

Seems like this was mostly done so that they didn't need to draw as much stuff on the screen.
Yep correct, I meant it was a way of creating a stable picture with the lower bandwidth (internal bandwidth I guess), which is another way of saying 'draw less stuff on the screen'.

They could have rendered internally at 480p, then interlaced it at output. Indeed this is what 480i consoles do, and is the more logical way. That would have created a more normal picture, without scanlines (which created games' unique look, but must have looked very weird to people when first introduced). But the systems didn't have the video juice to do that, so they found a hack that let them make a picture work within limitations.
 

Madao

Member
If the N64 had to render at 480, all games would have looked like F-Zero X. They got lucky in that department.

Also, it's crazy how 480 was standard for so long and how much resolutions have increased in just 10-15 years. Looking at the big picture, it looks like they're advancing too fast now.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
If the N64 had to render at 480, all games would have looked like F-Zero X. They got lucky in that department.

Also, it's crazy how 480 was standard for so long and how much resolutions have increased in just 10-15 years. Looking at the big picture, it looks like they're advancing too fast now.

Closer to 25 years than 15 years if you're including common PC standards.
 

D.Lo

Member
That's why I'm against any too-soon push for 4K - TV broadcasts and even the latest game consoles don't even output 1080p yet (consistently). What's the point of going higher when nothing even uses all the existing pixels yet?

The first consoles to finally even use all pixels on SDTVs were also the last to be made for SDTVs. Ridiculous, but perhaps just bad timing.
 

televator

Member
That's why I'm against any too-soon push for 4K - TV broadcasts and even the latest game consoles don't even output 1080p yet (consistently). What's the point of going higher when nothing even uses all the existing pixels yet?

The first consoles to finally even use all pixels on SDTVs were also the last to be made for SDTVs. Ridiculous, but perhaps just bad timing.

Yeah I'm hoping more for variable refresh rates adopted by console and TV manufacturers than I am looking forward to 4k.
 

chaosblade

Unconfirmed Member
Take a look at these guides: https://forum.speeddemosarchive.com/post/complete_creating_sda_acceptable_videos_ezcap_guide_.html

The first one specifically has the info that you're looking for. I went through setting this stuff up just yesterday actually. I ended up getting the Dazzle DVC 100 pretty cheap on ebay, and my PVM has A/V out so I didn't have to worry about splitting the signal, but here's a link for that: https://kb.speeddemosarchive.com/Splitters

You'll have to have a PC nearby to hookup the capture card to, as well as run AmaRecTV or whatever software you end up using.

I remember seeing that XCAPTURE-1 card when I was searching on how to set this stuff up, but I couldn't justify spending around $300 when I was just getting into recording stuff.

Wait, so cables with composite + s-video send the signal through both? I have the Monoprice cable, if that works maybe I won't need a splitter. I already have a couple RCA Y cables for audio.

That just leaves a capture card. And I realized I still have my super old PC that had a composite capture card in it. It was really laggy, but if I am splitting the signal it doesn't matter. Not sure the PC itself works, but I can use the card.
 

Bodacious

Banned
I think it would be like "NTSC Saturn>>>SCART with boosted sync>>>FM>>>HDMI"
Above is just an example, I don't know whether Saturn needs boosted sync or not. But one could probably compile a list of the 3 or so versions of each console in each region and account for which cable you need whether you're connecting to FM or BNC monitor inputs. The thing that freaks me out is when retro_console_accessories has 2 cables for the same system, I'm like ok which one is it?

That's what I was getting at, yeah.
 
Saturn does not need boosted sync.

The only consoles I have ever heard of needing boosted sync are 32X, Master System and Mega Drive, and as I found out Mark III as well (though it needs all kinds of additional external circuitry).

Didn't even know someone sold a boosted sync Saturn cable. Only ones I've seen were for the Genesis.

Anyway, I've just stuck to what Sixfortyfive just said with all my sync. Csync if the console does it and sync on luma if it doesn't. Don't think I have any systems that don't take either of those, and the only ones I got a luma cable for are PS1 and N64.
 
Just read up a little on sync.

- If your console has a CSYNC pin, get a plain CSYNC cable.
- If it doesn't have CSYNC but does have Luma, get a sync-on-luma cable.
- If it doesn't have either of those, get a sync-on-composite-video cable.
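
For what it's worth, those three rules amount to a tiny decision procedure. A rough sketch in Python; the console entries below are illustrative guesses, not a verified compatibility list, so check your own revision before buying:

    # Rough sketch of the cable-picking rules quoted above.
    def pick_cable(has_csync, has_luma):
        if has_csync:
            return "plain CSYNC cable"
        if has_luma:
            return "sync-on-luma cable"
        return "sync-on-composite-video cable"

    # Example entries only; verify against your actual console revision.
    consoles = {
        "Genesis model 1": {"has_csync": True, "has_luma": True},
        "N64 (RGB-modded)": {"has_csync": False, "has_luma": True},
    }
    for name, caps in consoles.items():
        print(f"{name}: {pick_cable(**caps)}")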
It's just a lot to keep track of when you add the possibility of different model versions of the same console potentially having different pin outs or signal. I want to be able to have a vendor like retro_console_accessories have this figured out on their products for me.
 
It's just a lot to keep track of when you add the possibility of different model versions of the same console potentially having different pin outs or signal. I want to be able to have a vendor like retro_console_accessories have this figured out on their products for me.

On her storefront she will have them separated by system. From there it is normally a choice between jp21 and scart for the type and then various syncs depending on the system. Some systems that have a different pinout depending on the region, like the Saturn, will also have different cable sets for euro and NA.

What are the systems you are trying to hook up? Perhaps we can help tell you the proper cable for each.
 
What are the systems you are trying to hook up? Perhaps we can help tell you the proper cable for each.
Mostly the 16 bit up to Dreamcast systems connected to a Framemeister or BNC monitor. I think I can figure it out but some systems, like the different SNES's, have me worried.
 
Mostly the 16 bit up to Dreamcast systems connected to a Framemeister or BNC monitor. I think I can figure it out but some systems, like the different SNES's, have me worried.

What is the region of your systems, along with the revision for the ones where that is applicable (i.e. Genesis model 1 or 2)?

I'm about to go to bed so will not be able to answer you right away, but if someone else doesn't I will get to it when I wake up. I also have no experience with what works best for the Framemeister, so will only be able to recommend for the old CRTs.
 
What is the region of your systems, along with the revision for the ones where that is applicable (i.e. Genesis model 1 or 2)?
I am actually still collecting consoles. I have a Genesis 3 but am probably going to get a 1. Same thing with SNES, I am going to pick up a 1-chip on eBay. I believe my Saturn is the model 2. I have a PS2 that I currently have connected via component but wonder if PS1 games would be better with SCART out of a PS1. Then there's the DC and I am pretty ignorant on that console, about which one to get.
 

Khaz

Member
It's just a lot to keep track of when you add the possibility of different model versions of the same console potentially having different pin outs or signal. I want to be able to have a vendor like retro_console_accessories have this figured out on their products for me.

Unless your display is finicky about the sync you're using, there is no difference in picture quality between csync and video sync. According to this thread, the xrgb doesn't need csync. No European TV will ever need csync. Only a handful of PVMs may need csync, and in that specific case you can just add a sync stripper / booster in your Scart-to-BNC adapter and be done with it.

Only two problems can arise with the Composite Video part of video sync:
- your display is confused by the additional signals, resulting in dropped frames or a rolling screen. Renders your screen unreadable.
- your display uses the Composite Video signal instead of RGB for some reason. It will look like shit because Composite is shit.

If you have no problem with your current video sync cables, you will gain nothing by switching to csync cables. Sync is just a ping to tell the display a new frame is coming. If this signal is messed up, your image is completely fucked. It's literally all or nothing. Composite video in Scart sync was designed as a backup signal for displays that couldn't use RGB for some reason, but RGB just uses the sync part of video sync.

You should be more worried about your sound quality; every cheap cable with non-shielded wires will have audio buzz due to electrical leakage between wires. Either get the option from retro_console_accessories, or use the audio out featured on some consoles (Mega CD, early PS1; others can be modded). It will also increase your theoretical image quality, but you won't see the difference because your eyes can't distinguish a 1% hue difference.
 
Apparently FBX is doing PS2 profiles next. Hopefully he'll blow mine out of the water.

Re: the above, I don't think this is necessarily true. PSX/PS2 output using composite video as sync has been known to have issues that don't completely destroy your image, instead giving you a checkerboard pattern.
 

Khaz

Member
Checkerboard pattern sounds like Composite Video. One should test with a Scart-to-Composite adapter to see if they have the same image. In my opinion it's just an illustration of the second problem in my list. There is no reason a messed-up sync signal would be able to make such a small change in image quality.

I know I do have a problem with my PSone, which seems to output Composite a fraction of a second before pins 8 or 16 get +5V, or something, resulting in the TV switching on but displaying Composite. Only by plugging the cable in with the console already on (I'm using a manual Scart switch to simulate this) am I able to get RGB. I suspect that if I used a sync-on-luma cable I would only get a black and white picture (full Luma signal) before the switch. I'm considering either an internal sync mod or a sync stripper, depending on how my Guncon will react to the sync. My PStwo slim has no problem of this sort, however.
 
Checkerboard pattern sounds like Composite Video. One should test with a Scart-to-Composite adapter to see if they have the same image. In my opinion it's just an illustration of the second problem in my list. There is no reason a messed-up sync signal would be able to make such a small change in image quality.

I know I do have a problem with my PSone, which seems to output Composite a fraction of a second before pins 8 or 16 get +5V, or something, resulting in the TV switching on but displaying Composite. Only by plugging the cable in with the console already on (I'm using a manual Scart switch to simulate this) am I able to get RGB. I suspect that if I used a sync-on-luma cable I would only get a black and white picture (full Luma signal) before the switch. I'm considering either an internal sync mod or a sync stripper, depending on how my Guncon will react to the sync. My PStwo slim has no problem of this sort, however.
AFAIK it isn't composite video. I'm not sure how the Framemeister decides which video signal to take over RGB, but I was under the impression that composite video as sync would not cause the issues present on CRTs, in that it would still use RGB over composite via SCART/JP21.
 

Khaz

Member
It will take Composite from videosync if Scart pins 8 and/or 16 are faulty. That is, if it is Scart compliant.

Does your checkerboard pattern look like this? (Comparison between Svideo/good and Composite/bad)


From http://www.micro-64.com/features/svideo.shtml

I don't know about XRGBs behaviour, I was under the impression reading the thread that they had no problem with video sync. If they do, then it would be interesting to add a sync stripper in the Scart-to-din adapter. And stop worrying about buying a Scart cable with the correct sync.

off topic svideo
It seems TVs can use Composite when fed through their Svideo port (Luma pin), and can apparently be confused by it and display Composite instead of extracting the Luma part of the signal to use with the Chroma pin. But I know nothing about Svideo standard.
 
It will take Composite from videosync if Scart pins 8 and/or 16 are faulty. That is, if it is Scart compliant.

Does your checkerboard pattern look like this? (Comparison between Svideo/good and Composite/bad)



From http://www.micro-64.com/features/svideo.shtml

I don't know about XRGBs behaviour, I was under the impression reading the thread that they had no problem with video sync. If they do, then it would be interesting to add a sync stripper in the Scart-to-din adapter. And stop worrying about buying a Scart cable with the correct sync.

off topic svideo
It seems TVs can use Composite when fed through their Svideo port (Luma pin), and can apparently be confused by it and display Composite instead of extracting the Luma part of the signal to use with the Chroma pin. But I know nothing about Svideo standard.

Yes, it does. I guess it is composite, then. Wonder if I can rewire it and fix the issue...
 
I'm having a problem with the Framemeister where I can't get any scan lines on any picture mode. I downloaded the current firmware. What am I doing wrong?
 
I'm having a problem with the Framemeister where I can't get any scan lines on any picture mode. I downloaded the current firmware. What am I doing wrong?

If you walk me through what you're doing trying to get them I may be able to help you out. Hard to say with just the info provided.
 