
Upscalers, CRTs, PVMs & RGB: Retro gaming done right!

Status
Not open for further replies.

televator

Member
Which, if I understand correctly, is why you can mod the cable into basically whatever you like. You can toggle between YPbPr and RGB on the chip.

Yes, that is the chip's primary function: it's a DAC. The problem was that it was quite a unique DAC for an apparently unique digital signal.
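Conceptually, the digital-to-analog step is simple: each digital code maps to a proportional voltage. A toy sketch in Python, assuming 8 bits per channel and the nominal 0.7 V peak of analog RGB video (the GameCube chip's actual bit depth and levels are assumptions here, not documented specs):

```python
# Toy model of a video DAC: map a digital code to an analog voltage.
# Assumes 8 bits per channel and a 0.7 V peak (nominal analog RGB level);
# the real chip's parameters may differ.
def dac_voltage(code, bits=8, vmax=0.7):
    max_code = (1 << bits) - 1  # 255 for 8 bits
    return round(code / max_code * vmax, 4)

print(dac_voltage(0))    # black level: 0.0 V
print(dac_voltage(255))  # full white: 0.7 V
```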
 
So I picked up an old IBM 5153 monitor at a thrift store for 20 bucks and was wondering if it would be decent for running some NES stuff. If not, that's cool. I'll probably hook a Pi up to it and mess around. It has a really great aesthetic.

Looks like this:
[Image: IBM CGA Display 5153, 1981-1986]

Anyone know anything about these?
 

Vespa

Member
It does not deinterlace. Its only function is line doubling... or tripling. It does not have an adequate frame buffer to do two-field motion-adaptive deinterlacing.

I see. I remember seeing that it did do deinterlacing, just not the method the Framemeister uses. I think it was single-field: good for lag but a compromise on image quality, I take it? I've never seen either first-hand.
 

Morfeo

The Chuck Norris of Peace
take a picture/video. It's really difficult to help you out when "flickering" is all we have to go on.

Ok, I tried to take some video of the flickering. It's pretty hard to capture, but I've done my best with my crappy mobile camera. All three videos are from the Xbox main menu where you set the clock. In the first video, you should be able to see how the image flickers (it is much worse in person) if you view it full screen. Disregard the fact that I can't stand still lol: https://youtu.be/v2IA8LZYRJA

The second video is from the other side of the clock where it is slightly less noticeable: https://youtu.be/fHf0k0OO1HU

And the third is from after the screen turns black, which the Xbox does after not being used for a while. In person, it is by far easiest to see when the screen is black; however, my crappy camera also has the hardest time capturing it at that point. So disregard the reflection on the right side of the TV, which appeared since I had to use some light to get the video; on the left side, it should be noticeable. The blue stuff that flickers is actually black on the TV, but came out blue in the video because I tried to direct light at it (and since the clock slightly below is green): https://youtu.be/lqGBsxoRPrs
 

televator

Member
I see. I remember seeing that it did do deinterlacing, just not the method the Framemeister uses. I think it was single-field: good for lag but a compromise on image quality, I take it? I've never seen either first-hand.

That is correct. It uses a single field (half the vertical resolution) to produce one frame, and then the other field for the next frame. It's a hit to image quality.

Deinterlacing is a method of restoring full vertical resolution by temporally realigning both fields to form one complete frame.
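The two approaches above can be sketched in a few lines of Python, treating a frame as a list of scanlines (the function names `bob` and `weave` are the common informal terms for these methods; this is just the idea, not any device's actual implementation):

```python
def bob(field):
    """Single-field deinterlacing: line-double one field.
    Each source line is repeated, so vertical detail is halved."""
    out = []
    for row in field:
        out.append(row)
        out.append(row)  # duplicate to fill the missing line
    return out

def weave(even_field, odd_field):
    """Combine both fields into one full-resolution frame by
    interleaving their lines (fine for static content; moving
    content needs motion-adaptive logic on top of this)."""
    out = []
    for even_row, odd_row in zip(even_field, odd_field):
        out.append(even_row)
        out.append(odd_row)
    return out

even = [[10, 10], [30, 30]]  # lines 0 and 2 of the source frame
odd  = [[20, 20], [40, 40]]  # lines 1 and 3

print(bob(even))         # [[10, 10], [10, 10], [30, 30], [30, 30]]
print(weave(even, odd))  # [[10, 10], [20, 20], [30, 30], [40, 40]]
```

Bob trades half the vertical detail for zero buffering (hence low lag); weave needs both fields in memory, which is why a frame buffer matters.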
 
Yeah, no fancy upscaling, but with a nice frame buffer available for movie-quality deinterlacing.

I'd pay $300-400 for such a thing.

It would be a big change in direction for them to make devices that are so sophisticated. Currently all of their products are relatively simple, passive for the most part.
 

Madao

Member
The chip inside the cable handles digital to analog conversion.

i see.

man, the GC is such a weird console when it comes to add ons. it seems like most of these things would have fit inside the console if they had bothered to design everything inside instead of making it like a mini Sega Genesis with tons of expansions.

especially the component cables. i wonder what was stopping them from doing a system like the Wii's multi out which can do component+older signals. keeping the same connector from SNES on GC wasn't worth it if that was the reason they didn't have a single connector.

on a different note, i still wonder what was supposed to go in the serial port 2. they never revealed anything that used that. it's up there with the Wii U's Gamepad expansion connector in the mystery connectors.
 

televator

Member
i see.

man, the GC is such a weird console when it comes to add ons. it seems like most of these things would have fit inside the console if they had bothered to design everything inside instead of making it like a mini Sega Genesis with tons of expansions.

especially the component cables. i wonder what was stopping them from doing a system like the Wii's multi out which can do component+older signals. keeping the same connector from SNES on GC wasn't worth it if that was the reason they didn't have a single connector.

on a different note, i still wonder what was supposed to go in the serial port 2. they never revealed anything that used that. it's up there with the Wii U's Gamepad expansion connector in the mystery connectors.

Saving $$$. That's it, really. Why spend money putting in a DAC that would only fully benefit people with progressive-scan monitors back then?

Far be it from me to defend Nintendo practices as a consumer though. lol I'm not about that life.
 

BocoDragon

or, How I Learned to Stop Worrying and Realize This Assgrab is Delicious
i see.

man, the GC is such a weird console when it comes to add ons. it seems like most of these things would have fit inside the console if they had bothered to design everything inside instead of making it like a mini Sega Genesis with tons of expansions.

especially the component cables. i wonder what was stopping them from doing a system like the Wii's multi out which can do component+older signals. keeping the same connector from SNES on GC wasn't worth it if that was the reason they didn't have a single connector.

on a different note, i still wonder what was supposed to go in the serial port 2. they never revealed anything that used that. it's up there with the Wii U's Gamepad expansion connector in the mystery connectors.
It's all due to Nintendo's desire to turn a profit on every console sold. They cut absolutely everything unnecessary.

If that digital-to-analog chip in the component cable cost a couple of cents... it was cut. Make the 0.5% of progressive-scan crazies in the early 2000s pay for the cost.

I found the lack of digital sound output more alarming back in the day... Nintendo didn't mess with that either.
 

Conezays

Member
So my component-to-BNC connectors came, which I'm using on my PVM-1354Q. Besides sticking the cables in the RGB/component slots (duh), is there anything else I should know regarding adjustments, etc.? Right off the bat I noticed the picture being much brighter than my Saturn via S-Video, allowing me to turn down the contrast and brightness. So far I think it looks great, although I'm not sure I'm seeing a huge difference between my Toshiba via component cables and this PVM. Any tips are much appreciated.

Edit: And it was plugged into the video outputs, not inputs. Ugh, I need to sleep.
 

Peltz

Member
So I picked up an old IBM 5153 monitor at a thrift store for 20 bucks and was wondering if it would be decent for running some NES stuff. If not, that's cool. I'll probably hook a Pi up to it and mess around. It has a really great aesthetic.

Looks like this:


Anyone know anything about these?
Can't say that I do. But if you could get some close-up shots of it running some video games, I'd love to see :)
 
Ok, I tried to take some video of the flickering. It's pretty hard to capture, but I've done my best with my crappy mobile camera. All three videos are from the Xbox main menu where you set the clock. In the first video, you should be able to see how the image flickers (it is much worse in person) if you view it full screen. Disregard the fact that I can't stand still lol: https://youtu.be/v2IA8LZYRJA

The second video is from the other side of the clock where it is slightly less noticeable: https://youtu.be/fHf0k0OO1HU

And the third is from after the screen turns black, which the Xbox does after not being used for a while. In person, it is by far easiest to see when the screen is black; however, my crappy camera also has the hardest time capturing it at that point. So disregard the reflection on the right side of the TV, which appeared since I had to use some light to get the video; on the left side, it should be noticeable. The blue stuff that flickers is actually black on the TV, but came out blue in the video because I tried to direct light at it (and since the clock slightly below is green): https://youtu.be/lqGBsxoRPrs
It's still hard to tell, but this looks like the noise I was getting when I swapped to RGB on my PS2. Change your A/D level a bit north, and mess with black level and such. If it's more than the weird 'waves' of lighter color it seems to be, then it might be an issue with your hardware at some point.
 
It's all due to Nintendo's desire to turn a profit on every console sold. They cut absolutely everything unnecessary.

If that digital-to-analog chip in the component cable cost a couple of cents... it was cut. Make the 0.5% of progressive-scan crazies in the early 2000s pay for the cost.


The problem is, if you include every component built into the console, you end up with a $599 launch price. That worked out well.
 

Morfeo

The Chuck Norris of Peace
It's still hard to tell, but this looks like the noise I was getting when I swapped to RGB on my PS2. Change your A/D level a bit north, and mess with black level and such. If it's more than the weird 'waves' of lighter color it seems to be, then it might be an issue with your hardware at some point.

Thanks, I know it's hard to make out from those videos, but I appreciate your effort a lot! Will try your suggestions. I also sent the same to Solaris for them to evaluate.
 

D.Lo

Member
If that digital-to-analog chip in the component cable cost a couple cents.... it was cut. Make the 0.5% of progressive scan crazies in the early 2000s pay for the cost.
Actually, the chip in the cable was an excellent high quality one that would have cost a few dollars.

And it isn't a component out port; it's a full digital out port, including audio. It's IMO an elegant, modular design. It could be used for component the way they did it, and it could have been upgraded to any digital connection that became common on TVs later.

In the end it wasn't popular; no digital TV standard developed over the console's lifetime, and, like you said, the modular design let the RGB monitor crazies pay for what is effectively an upgraded console through the cable price.

The Wii included a component DAC on board and it was an inferior one, so we were stuck with worse video there. If it was done like the Gamecube, the Wii could possibly have had a real digital HDMI adapter.
 

SegaShack

Member
So about a month ago I got an HD home theater projector (BenQ 2050) for my living room. I keep my classic consoles/PVM in another room in my house. The projector has a VGA port on it, and today, for the hell of it, I wondered if RGB going through my Sync Strike into VGA would work. Now, for the record, I am aware that most VGA inputs do not accept 15 kHz, which is what 240p RGB is. To my amazement, this thing worked flawlessly. The games looked great too once I got the settings right.

I mean just take a look at this:

Don't get me wrong, games do look a bit nicer on the PVM, most likely due to the scan lines. I tested NES, SNES, Genesis, Saturn, Gamecube (Component), Dreamcast, and N64. Dreamcast wasn't recognized over RGB for some reason. Of the remaining 5, everything looked beautiful besides N64 and Gamecube. N64 looks like complete garbage despite using the same SCART cable as the SNES. I never realized how reliant N64 is on scanlines to mask different things. Gamecube looks pretty good but nowhere near as nice as it is on the PVM.

Here is a video I recorded of the different systems being projected:

https://www.youtube.com/watch?v=2DcUD0gMwyE

I am on a 110" projector screen, and now I have my first-world-problems dilemma: should I keep using my PVM 20L5 for gaming, or the projector?
 
The Wii included a component DAC on board and it was an inferior one, so we were stuck with worse video there. If it was done like the Gamecube, the Wii could possibly have had a real digital HDMI adapter.

Has anyone taken the time to make a rough schematic of the Wii board? I'm kinda curious if the DAC input could be tapped to make an HDMI out similar to the GC's.

Like, I'm actually pretty happy with the way my Wii looks (the softness of 480p is just fine by me), but I'm curious as an intellectual endeavor.
 
Has anyone taken the time to make a rough schematic of the Wii board? I'm kinda curious if the DAC input could be tapped to make an HDMI out similar to the GC's.

Like, I'm actually pretty happy with the way my Wii looks (the softness of 480p is just fine by me), but I'm curious as an intellectual endeavor.

I'm pretty sure Unseen found a way to do this just a little while ago.
 
GARO preorders are up.

It's a component to RGBHV (VGA or Euro SCART) transcoder that works from 240p to 1080p (and VGA timings up to 1600x1200). It can also add "scanlines" to 480p and higher. It is not a scaler: it doesn't buffer frames and hence does not add lag.

http://www.beharbros.com/#!garo/cnwf

I use a Framemeister and run my retro consoles via RGB SCART through 2 gscartsw switches. Right now I have the Xbox, ps2, Wii, PSP and GameCube hooked up via a component switch run through the d-terminal input on the Framemeister. Would there be any advantage to using the GARO for these consoles over the D-terminal in?
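Since a transcoder like the GARO is doing color-space math rather than scaling, the core operation is just a per-sample linear transform. A rough sketch of the YPbPr-to-RGB conversion using the standard BT.601 coefficients (the GARO itself does this in analog hardware; this only illustrates the math):

```python
def ypbpr_to_rgb(y, pb, pr):
    """Convert one YPbPr sample (y in [0, 1]; pb, pr in [-0.5, 0.5])
    to RGB using BT.601 coefficients, clamped to [0, 1]."""
    r = y + 1.402 * pr
    g = y - 0.344136 * pb - 0.714136 * pr
    b = y + 1.772 * pb
    clamp = lambda v: min(1.0, max(0.0, v))
    return (clamp(r), clamp(g), clamp(b))

print(ypbpr_to_rgb(1.0, 0.0, 0.0))  # pure white -> (1.0, 1.0, 1.0)
```

Note that sync handling (the "HV" in RGBHV) is a separate job: the transcoder also has to strip or regenerate sync, which is where devices like this differ in practice.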
 
I'm pretty sure Unseen found a way to do this just a little while ago.

Ahh:
Unseen said:
Based on a few preliminary measurements, the Wii seems to use the same video data format internally, but with 1.8V levels instead of 3.3V. Since the Wii already has a component output, connecting this board to it probably isn't worth the hassle unless you've fried your video output. It also seems that the Wii internally transports audio using I2S, so why isn't there an SPDIF mod for it yet?
 

Mega

Banned
Does anyone know if the JVC TM-A210G has extra input cards to get RGB? Two are being given away but they only have composite and s-video as of now.

There are no optional card slots on that model. It's stuck with composite and s-video.

H & V size looks too big to me.

Are you only connecting one system?

Leaving the red boxes slightly out is actually fine, and in some cases preferable, if you want a one-setting-fits-all approach for all your consoles without having to recalibrate each time.

Of the remaining 5, everything looked beautiful besides N64 and Gamecube. N64 looks like complete garbage despite using the same SCART cable as the SNES. I never realized how reliant N64 is on scanlines to mask different things. Gamecube looks pretty good but nowhere near as nice as it is on the PVM.

Here is a video I recorded of the different systems being projected:

https://www.youtube.com/watch?v=2DcUD0gMwyE

I am on a 110" Projector Screen and now I have my first world problems dilemma. Should I keep using my PVM 20L5 for gaming or the projector?

I say use the PVM. It generally looks better. Use the projector if you have guests over for more comfortable multiplayer.

N64.... I think a nice aperture grille, or even better in this case a shadow mask, with a soft picture is the main factor in masking the system's graphical quirks. The more defined the scanlines appear, the uglier the graphics look. I dunno why, but phonedork noticed sharper scanlines have the (unintended) effect of revealing more of the N64's ugly blur filter. I completely agree from what I've seen myself.
 

SegaShack

Member
I say use the PVM. It generally looks better. Use the projector if you have guests over for more comfortable multiplayer.

N64.... I think a nice aperture grille, or even better in this case a shadow mask, with a soft picture is the main factor in masking the system's graphical quirks. The more defined the scanlines appear, the uglier the graphics look. I dunno why, but phonedork noticed sharper scanlines have the (unintended) effect of revealing more of the N64's ugly blur filter. I completely agree from what I've seen myself.
Interesting about the 64. I need to get a gameshark and try out those filter removals also. I think you are right about using this for guests. Either way, Gamecube, N64, and Dreamcast will stay hooked up to the PVM. NES, SNES, GEN, and Saturn look close enough to the PVM that I'll probably keep them on the projector for now since it's such a big picture vs a 19" screen.
 

Mega

Banned
I think it's easier to use an Everdrive. Stuff below was in MLiG's last video. https://www.youtube.com/watch?v=QDiHgKil8AQ

http://retrorgb.com/n64.html
- If you have an Everdrive 64 (or other ROM cart), it'll be easier to patch the ROMs [than to] type in GameShark codes. Poregon has created some great patches that can either be applied with Lunar IPS, or by using the auto-patch feature of the Everdrive 64 (just make sure the patch has the exact same file name as the ROM): http://n64.poregon.com/shared/

It makes a nice improvement, btw. Highly recommended. Killing the AA blur, combined with the Ultra HDMI's secondary blur removal and Sharp Pixels/integer scaling, gets you a PS1-like picture.

On a related note, I contacted the Ultra HDMI creator and he is releasing a firmware update that will allow the use of the scanline filter with integer scaling. For those unaware, the default scaling option is blurry. The solution is to enable Sharp Pixels (integer scaling) but that grays out a bunch of stuff on the menu, including scanline filter options which are really needed to complete the look. Without scanlines it looks like a sharp, crawling pixelated mess (MLiG example below). He didn't think the two would visually work well together but became aware it was an issue after phonedork's last video that touched on the problem. The patch is 1-2 months away and because of a risk of bricking the console, it will only be given to mod installers. I was told I could eventually get it myself from the guy who sold me the Ultra HDMI N64 with an understanding of the risk involved. I get why he's doing it this way since it ties into how he primarily distributes the Ultra HDMI only to people who offer modding services.

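To make the Sharp Pixels distinction concrete: integer scaling repeats each source pixel a whole number of times, with no blending, so edges stay crisp; a scanline filter then darkens alternate output lines to restore the CRT-style look. A toy sketch of both steps (the function names are made up for illustration; the Ultra HDMI does this in hardware):

```python
def integer_scale(frame, factor):
    """Nearest-neighbour integer scaling: each pixel becomes a
    factor x factor block, so no interpolation blur is introduced."""
    out = []
    for row in frame:
        scaled_row = [p for p in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(scaled_row))
    return out

def add_scanlines(frame, strength=0.5):
    """Darken every other output line to mimic the gaps between
    scanlines on a CRT."""
    return [
        [int(p * (1 - strength)) if y % 2 else p for p in row]
        for y, row in enumerate(frame)
    ]

frame = [[100, 200]]                 # one 2-pixel scanline
doubled = integer_scale(frame, 2)    # [[100, 100, 200, 200], [100, 100, 200, 200]]
print(add_scanlines(doubled))        # second line darkened to 50%
```

Non-integer scaling has to interpolate between pixels, which is exactly the blurriness the default scaling option shows.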
 

IrishNinja

Member
^that does look nice, i gotta work on patching those ROMs on my everdrive sometime

also, $250 xrgb-3 for sale on SHMUPs forums, i know some people dig that lagless model, not sure if the price is good or not but i don't see them very often

*edit aw man, boco's banned?
 

Rich!

Member
Yeah, I can imagine

I'm using my GC exclusively in forced 240p on my JVC as I prefer the lower res progressive display to the strobing of 480/576i. Some games look very off, but some are stunning
 
Leaving the red boxes slightly out is actually fine, and in some cases preferable, if you want a one-setting-fits-all approach for all your consoles without having to recalibrate each time.

I notice a large shift in the display area going from Sega CD to Saturn to SNES on my setup if I extend the borders past the screen. I can't see how having the image eclipse the edges of the screen is preferable to having the entire image displayed.
 

Mega

Banned
Yeah, I can imagine

I'm using my GC exclusively in forced 240p on my JVC as I prefer the lower res progressive display to the strobing of 480/576i. Some games look very off, but some are stunning

You're literally throwing out half of the game's picture by forcing 240p on games that are meant to be played at 480i and 480p. The tradeoff is not worth it.


I notice a large shift in the display area going from Sega CD to Saturn to SNES on my setup if I extend the borders past the screen. I can't see how having the image eclipse the edges of the screen is preferable to having the entire image displayed.

It's personal preference, but if you don't set your CRT's horizontal size with enough overscan, then you will get a shift that translates to a big empty space on one side. In my experience, setting it perfectly for NES and SNES will leave Genesis and PC Engine games shifted right with an annoying gap on the left, AND certain games like Kirby and SMB3 will do the same.
 

Rich!

Member
You're literally throwing out half of the game's picture by forcing 240p on games that are meant to be played at 480i and 480p. The tradeoff is not worth it.

It is when the alternative is head-splitting migraines. I literally cannot cope with the strobing that interlaced displays cause - they've always given me headaches, but back in the day I coped with it. I can't now.

Anyhow, it's just a fun side distraction. I play all of my gamecube games on Nintendont and dolphin.
 

Vespa

Member
Anybody notice ringing on their PS2's 480p output? I've just tested it in YUV and RGsB and see ringing in both colour spaces. 480i is fine, strangely enough.

I'm thinking it might be the component cable. I know the PS2 is notorious for having a bad component signal, but I thought that was only in YUV.
 
You're literally throwing out half of the game's picture by forcing 240p on games that are meant to be played at 480i and 480p. The tradeoff is not worth it.

I'm totally with you on all 3D games rendered in 480p. Forcing it to 240p is like putting a piece of plastic with black stripes on your screen to block out half the image detail.
 
Deinterlacing gets a bad rap.

Yeah, it's not good to do it on 240p sources, but for 480i content, when done well, it can have low latency and the output can look very nice.
 

Mega

Banned
It is when the alternative is head-splitting migraines. I literally cannot cope with the strobing that interlaced displays cause - they've always given me headaches, but back in the day I coped with it. I can't now.

Anyhow, it's just a fun side distraction. I play all of my gamecube games on Nintendont and dolphin.

It's dependent on the display. 480i on my 13" JVC looks like 480p (also kind of true for 240p), perhaps because of how the phosphors are arranged on the shadow mask. I tried but can't detect the line-alternating flickering that's commonplace on all my other monitors. 480 lines displayed simultaneously below. How? I dunno. It's amazing, tbh.

 

Madao

Member
Actually, the chip in the cable was an excellent high quality one that would have cost a few dollars.

And it isn't a component out port; it's a full digital out port, including audio. It's IMO an elegant, modular design. It could be used for component the way they did it, and it could have been upgraded to any digital connection that became common on TVs later.

In the end it wasn't popular; no digital TV standard developed over the console's lifetime, and, like you said, the modular design let the RGB monitor crazies pay for what is effectively an upgraded console through the cable price.

The Wii included a component DAC on board and it was an inferior one, so we were stuck with worse video there. If it was done like the Gamecube, the Wii could possibly have had a real digital HDMI adapter.

after reading this, it really looks like the GC was too ahead of its time in some areas. maybe that digital out was planned to become the standard connector for future consoles and the cables would do the trick but the expenses killed it. maybe if the GC sold like the PS2 that would have been realized.
 
wonder if that mod would tighten up the Wii's soft component picture, at least.

The Wii's component output isn't half as bad as people say, IMO, at least not on my launch Wii. Even the MLiG guys admitted it isn't worth the price of GameCube component cables over the Wii if it weren't for the Game Boy Player, which is exclusive to the actual GameCube.

Don't forget that if your Wii looks slightly washed out compared to a real GameCube, you can calibrate your display's saturation and contrast to compensate. Many comparisons from people like phonedork and even RetroRGB make a point of NOT calibrating and then saying "hey, look at this inferior output over here". Always calibrate.
 

Mega

Banned
You need to do a guide on how to photograph CRTs, Mega. That looks great!

Pretty basic stuff. Dim or no lights to cut reflections. Use a prime lens, or zoom about halfway, to minimize barrel distortion. Shoot from about a foot back to avoid lens distortion.

Shutter speed at least 1/40 sec if you're shooting handheld; if you use a tripod, you can go lower. Keeping that minimum shutter speed in mind, use the lowest possible ISO to retain image quality, down to the details of individual phosphors.

Shoot in manual and at an angle, even if slight. Your CRT probably has a curved tube... that, plus the glass screen on top and other CRT factors, can throw off autofocus, leading to blurry images. Shooting at an angle helps autofocus because even if the focus point is off, something will still be in focus within the depth of field. Not the case if you shoot dead-on.

Your camera will probably capture the scrolling refresh in the digital preview or in the final shot, so shoot continuous and pick the best one where it doesn't show or is most minimal.

If your camera supports it, change white balance and saturation to match what you're seeing on your CRT. Otherwise you may get a weird color cast and a muted look. This varies a lot from one screen to the next. Nothing I've tried with a few cameras accurately captures the look of a backlit/biverted DMG and GBP.

Anyway, just play around, and if it looks good enough, you're done. I find like 99% of the pics people have taken in the Scanlines thread to be perfectly good.
 
I'm actually going to start shooting CRTs for an art project. I'm going to shoot RAW instead of JPG. No worries about white balance when you can always change it later! Also going to start using a tripod and remote shutter release.
 