isn't the GC's "digital" output still analog since component is within that or is the conversion done inside the cable itself?
The chip inside the cable handles digital to analog conversion.
which, if I understand correctly, is why you can mod the cable into basically whatever you like. You can toggle between YPbPr and RGB on the chip.
It does not deinterlace. Its only function is line doubling... or tripling. It does not have an adequate frame buffer to do two-field motion adaptive deinterlacing.
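The distinction matters because line doubling only needs a single scanline of memory, which is why such a device adds essentially no lag. A minimal sketch (illustrative Python, not anything resembling the device's actual firmware):

```python
# Line doubling: each incoming scanline is simply emitted twice.
# Only one line of buffer is needed, so no frame of lag is added.
def line_double(frame):
    """Repeat each scanline once: 240 lines in, 480 lines out."""
    out = []
    for line in frame:
        out.append(line)
        out.append(line)  # re-emit the same scanline
    return out

# Toy 240-line "frame" where each scanline is 4 identical pixels.
frame_240p = [[i] * 4 for i in range(240)]
doubled = line_double(frame_240p)
assert len(doubled) == 480
```

Motion-adaptive deinterlacing, by contrast, has to compare pixels across at least two fields in time, which is exactly the frame buffer this device lacks.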
take a picture/video. It's really difficult to help you out when "flickering" is all we have to go on.
I see. I remember seeing that it did do deinterlacing, just not the method that the Framemeister uses. I think it was single-field, good for lag but a compromise on image quality, I take it? I've never seen either first-hand.
Yeah, no fancy upscaling, but it has a nice frame buffer available for movie-quality deinterlacing.
I'll pay 300-400 for such a thing.
It would be a big change in direction for them to make devices that are so sophisticated. Currently all of their products are relatively simple, passive for the most part.
i see.
man, the GC is such a weird console when it comes to add ons. it seems like most of these things would have fit inside the console if they had bothered to design everything inside instead of making it like a mini Sega Genesis with tons of expansions.
especially the component cables. i wonder what was stopping them from doing a system like the Wii's multi out which can do component+older signals. keeping the same connector from SNES on GC wasn't worth it if that was the reason they didn't have a single connector.
on a different note, i still wonder what was supposed to go in the serial port 2. they never revealed anything that used that. it's up there with the Wii U's Gamepad expansion connector in the mystery connectors.
So I picked up an old IBM 5153 monitor at a thrift store for 20 bucks and was wondering if it was decent to run some NES stuff on. If not, that's cool; I'll probably hook a Pi up to it and mess around. It has a really great aesthetic.
Looks like this:
Anyone know anything about these?
Can't say that I do. But if you could get some close-up shots of it running some video games, I'd love to see.
Ok, I tried to take some video of the flickering. It's pretty hard to capture, but I've done my best with my crappy mobile camera. All three videos are from the Xbox main menu where you set the clock. In the first video, you should be able to see how the image flickers (it is much worse in person) if you go full screen. Disregard the fact that I can't stand still lol: https://youtu.be/v2IA8LZYRJA
The second video is from the other side of the clock where it is slightly less noticeable: https://youtu.be/fHf0k0OO1HU
And the third is from after the screen turns black, which the Xbox does after not being used for a while. In person, it is by far easiest to see when the screen is black; however, my crappy camera also has the hardest time capturing it at that point. So disregard the reflection on the right side of the TV, which appeared since I had to use some light to get the video; on the left side, it should be noticeable. The blue stuff that flickers is actually black on the TV, but came out blue in the video because I tried to direct light at it (and since the clock slightly below is green): https://youtu.be/lqGBsxoRPrs
It's all due to Nintendo's desire to turn a profit on every console sold. They cut absolutely everything unnecessary.
If that digital-to-analog chip in the component cable cost a couple cents.... it was cut. Make the 0.5% of progressive scan crazies in the early 2000s pay for the cost.
it's still hard to tell, but this looks like the noise I was getting when I swapped to RGB on my PS2. change your A/D a bit north, and mess with black level and such. If it's more than like... weird 'waves' of lighter color like it seems, then it might be an issue with your hardware at some point.
Has anyone taken the time to give a rough schematic of the Wii board? I'm kinda curious if the dac input could be tapped to make an HDMI out similar to the GC.
Like, I'm actually pretty happy with the way my Wii looks (the softness of 480p is just fine by me) but as an intellectual endeavor.
GARO preorders are up.
It's a component to RGBHV (VGA or Euro SCART) transcoder that works from 240p to 1080p (and VGA timings up to 1600x1200). It can also add "scanlines" to 480p and higher. It is not a scaler. It doesn't buffer frames and hence does not add lag.
http://www.beharbros.com/#!garo/cnwf
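For anyone curious what a transcoder like that actually computes: converting component (YPbPr) to RGB is a fixed per-pixel matrix. The GARO does this in analog circuitry, not software; the sketch below just illustrates the math, using the standard BT.601 coefficients for SD sources (an assumption on my part, the device's exact coefficients aren't documented here):

```python
# Illustrative YPbPr -> RGB conversion, BT.601 coefficients,
# with signals normalized so Y is 0..1 and Pb/Pr are -0.5..0.5.
def ypbpr_to_rgb(y, pb, pr):
    r = y + 1.402 * pr
    g = y - 0.344136 * pb - 0.714136 * pr
    b = y + 1.772 * pb
    # Clamp to the valid 0..1 range, as an analog stage would saturate.
    clamp = lambda v: min(max(v, 0.0), 1.0)
    return clamp(r), clamp(g), clamp(b)

# Grey has zero colour-difference signals, so it passes through unchanged.
assert ypbpr_to_rgb(0.5, 0.0, 0.0) == (0.5, 0.5, 0.5)
```

Because it's a pure per-pixel mapping, no frame (or even line) buffering is required, which is why a transcoder like this adds no lag.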
I'm pretty sure Unseen found a way to do this just a little while ago.
Unseen said: Based on a few preliminary measurements, the Wii seems to use the same video data format internally, but with 1.8V levels instead of 3.3V. Since the Wii already has a component output, connecting this board to it probably isn't worth the hassle unless you've fried your video output. It also seems that the Wii internally transports audio using I2S, so why isn't there an SPDIF mod for it yet?
Does anyone know if the JVC TM-A210G has extra input cards to get RGB? Two are being given away but they only have composite and s-video as of now.
H & V size looks too big to me.
Are you only connecting one system?
Of the remaining 5, everything looked beautiful besides N64 and Gamecube. N64 looks like complete garbage despite using the same SCART cable as the SNES. I never realized how reliant N64 is on scanlines to mask different things. Gamecube looks pretty good but nowhere near as nice as it is on the PVM.
Here is a video I recorded of the different systems being projected:
https://www.youtube.com/watch?v=2DcUD0gMwyE
I am on a 110" Projector Screen and now I have my first world problems dilemma. Should I keep using my PVM 20L5 for gaming or the projector?
wonder if that mod would tighten up the Wii's soft component picture, at least.
I say use the PVM. It generally looks better. Use the projector if you have guests over for more comfortable multiplayer.
Interesting about the 64. I need to get a GameShark and try out those filter removals also. I think you are right about using this for guests. Either way, Gamecube, N64, and Dreamcast will stay hooked up to the PVM. NES, SNES, GEN, and Saturn look close enough to the PVM that I'll probably keep them on the projector for now since it's such a big picture vs a 19" screen.
N64.... I think a nice aperture grille or even better in this case, shadowmask, with a soft picture is the main factor in masking the system's graphical quirks. The more defined scanlines appear, the uglier the graphics look. I dunno why, but phonedork noticed sharper scanlines have the (unintended) effect of revealing more of the N64's ugly blur filter. I completely agree from what I've seen myself.
Some GameCube games look soooo gooood in 240p
Yeah, I can imagine
I'm using my GC exclusively in forced 240p on my JVC as I prefer the lower res progressive display to the strobing of 480/576i. Some games look very off, but some are stunning
Leaving the red boxes slightly out is actually fine and in some cases preferable if you want a one setting fits all approach to all your consoles and not having to recalibrate each time.
There are no optional card slots on that model. It's stuck with composite and s-video.
Super mega bummer. Would it be worth trying this converter http://www.keene.co.uk/keene-rgb-to-s-video-convertor-uk-psu.html for a monitor that only accepts s-video? It would be almost $100 with shipping, so I'm leaning towards no, even if the monitors are free.
I notice a large shift in the display area going from Sega CD to Saturn to SNES on my setup if I extend the borders past the screen. I can't see how having the image eclipsing the edges of the screen preferable to having the entire image displayed.
You're literally throwing out half of the game's picture by forcing 240p on games that are meant to be played at 480i and 480p. The tradeoff is not worth it.
It is when the alternative is head-splitting migraines. I literally cannot cope with the strobing that interlaced displays cause - they've always given me headaches, but back in the day I coped with it. I can't now.
Anyhow, it's just a fun side distraction. I play all of my GameCube games on Nintendont and Dolphin.
De-interlacing gets a bad rap.
Like yeah it's not good to do it on 240p sources but for 480i content when done well it can have low latency and the output can look very nice.
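The "done well" part usually means motion-adaptive: weave the two fields together where the picture is static (full 480-line detail) and fall back to line doubling where motion is detected (no combing). A rough sketch of the idea, with the field layout and threshold being my own illustrative choices, not any particular scaler's algorithm:

```python
# Motion-adaptive deinterlacing sketch. cur_field and prev_field are
# consecutive fields of the SAME parity; other_field is the opposite
# parity field that sits between them. Each field is a list of scanlines.
def deinterlace(cur_field, other_field, prev_field, thresh=16):
    out = []
    for i, line in enumerate(cur_field):
        out.append(line)
        # Did this scanline change between the two same-parity fields?
        moved = any(abs(a - b) > thresh
                    for a, b in zip(line, prev_field[i]))
        # Static: weave in the other field's line (full detail).
        # Motion: repeat our own line (bob), avoiding comb artifacts.
        out.append(line if moved else other_field[i])
    return out

# One-scanline toy fields: static content weaves, changed content bobs.
static = deinterlace([[10, 10]], [[20, 20]], [[10, 10]])
assert static == [[10, 10], [20, 20]]   # woven: both fields kept
moving = deinterlace([[10, 10]], [[20, 20]], [[200, 200]])
assert moving == [[10, 10], [10, 10]]   # bobbed: current line repeated
```

Since only a field or two needs buffering, latency can stay well under a frame, which is why 480i deinterlacing done this way gets an unfairly bad rap.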
Actually, the chip in the cable was an excellent high quality one that would have cost a few dollars.
And it isn't a component out port, it's a full digital out port, including audio. It's IMO an elegant, modular design. It could be used for component the way they did it, and that could be upgraded to any digital connection that became common on TVs later.
In the end it wasn't popular, no digital TV standard developed over the console's lifetime, and like you said, with the modular design they could let the RGB monitor crazies pay for what is effectively an upgraded console through the cable price.
The Wii included a component DAC on board and it was an inferior one, so we were stuck with worse video there. If it was done like the Gamecube, the Wii could possibly have had a real digital HDMI adapter.
I agree with you 110%, but who were you talking to? lol
You need to do a guide on how to photograph CRTs, Mega. That looks great!
I'm actually going to start shooting CRTs for an art project. I'm going to shoot RAW instead of JPG. No worries about white balance when you can always change it later! Also going to start using a tripod and remote shutter release.