It's called a CRT
There still is a pause, though. And it can apparently be exacerbated by switches, splitters, displays, capture cards, etc.
Honestly resolution changes are just a bummer in general. There isn't really a good solution that I'm aware of.
It's called a CRT
I guess I don't see why, if your TV can handle 480i well, you would even be using the OSSC or any upscaler. Because if it can handle 480i BETTER than the OSSC it can probably handle 240p just fine. Seems to me if your TV can handle 480i better than an upscaler, it's probably a CRT TV lol.
I'm not entirely sure how the OSSC works, but it's entirely possible that it can automatically pass through 480i via some optional setting. I used RE2 as an example where that would be a handy feature: segments of the game that are 240p would be line doubled, and segments that are 480i wouldn't be blurred by the OSSC. That way you'd get the best possible image quality.
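To illustrate the idea (a purely hypothetical sketch in Python, not the OSSC's actual firmware logic, and all of the names are made up): 240p content gets line doubled to 480p, while 480i is passed through untouched so the TV's own deinterlacer can handle it.

[code]
# Hypothetical sketch of the routing behaviour described above.
# None of these names come from the OSSC firmware; they are placeholders.

def route_frame(input_mode: str, scanlines: list) -> list:
    """Decide what to do with incoming video based on its scan mode."""
    if input_mode == "240p":
        # Line double: repeat every scanline so 240 lines become 480,
        # giving a clean 480p output with no deinterlacing involved.
        return [line for line in scanlines for _ in (0, 1)]
    if input_mode == "480i":
        # Pass through untouched; the TV deinterlaces and scales it,
        # avoiding any extra blur from the line doubler.
        return scanlines
    raise ValueError(f"unexpected input mode: {input_mode}")

# Example: a 240p frame of 240 scanlines comes out as 480 lines.
frame_240p = [f"scanline {i}" for i in range(240)]
assert len(route_frame("240p", frame_240p)) == 480
[/code]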
Looking at what I'll actually use the PVM for, it's best I go with the 20M4U anyway. For 480p+ content I prefer having it scaled on a larger modern screen, but for 240p I like the look that a PVM/CRT has. Aside from the novelty, I don't think I'd get too much use out of running 480p content on a PVM.
It's more than just novelty. Unless you have good upscaling hardware, 480p Gamecube games and the entire Wii library will look better at native res on a 480p CRT. Even then a good CRT should still have the edge since there won't be scaling artifacts and HDTV lag. The properties of the CRT screen would also do a better job of masking low res textures and lack of AA.
Wii + HDTV is still worth it because many of the games are meant to be played about 6 feet back with Wii remote pointing and gestures... also 4+ multiplayer games. So a 14-20" CRT isn't always the best choice.
It's called a CRT
Wait...what? What is this sorcery?
Yeah, I think there was a video posted earlier in this thread where the MLiG guys showed their setups, and the video went to a PVM CRT which has RGB output back to the capture device, IIRC.
You'd pay £200 for a PVM? Madness.
Though, I know the market has changed since I got my 20M4E for £20. lol
So I posted this in another topic, but this is probably more appropriate here. I finally RGB modded my SNES mini and hooked it up to my Sony PVM with my SCART cables, but I've encountered an issue. I can get audio and I can get video, but not both at the same time. I've been trying to figure out either what I've done wrong or what settings I need to adjust, but I haven't been able to figure it out.
Dude, look behind it
Yes, you can buy the remote separately; it will work if it's the right model.
Irish drunk posting again, eh? Bro look in your wallet. Your CRT model number is on your driver's license.
Cell phone cameras are thin and can probably easily fit behind the unit without rotating it. That's how I took a photo of my Trinitron's cable input panel, anyway.
but it's heavvyyyy
I guess I don't see why, if your TV can handle 480i well, you would even be using the OSSC or any upscaler. Because if it can handle 480i BETTER than the OSSC it can probably handle 240p just fine. Seems to me if your TV can handle 480i better than an upscaler, it's probably a CRT TV lol.
Are you able to test the cable with other consoles?
What are you doing differently to get audio only sometimes and video only other times?
So I'm currently at work but I was able to find my TV in Google image search.
EDIT:
So here's where I plugged in the video cables (the white arrow is actually representing a black cable):
As I was marking up the image of the back of the TV I started to realize that I may have plugged the audio into the wrong ports. There's a white audio cable and a red one... where should they be plugged in?
EDIT2:
I think I just realized that I have the black video cable plugged into the wrong spot. It should go into EXT SYNC IN, correct?
I was thinking in light of timu's desire to record. CRTs don't help that, unfortunately lol.
It's called a CRT
Yeah, basically this.
I was thinking in light of timu's desire to record. CRTs don't help that, unfortunately lol.
So here's where I plugged in the video cables (the white arrow is actually representing a black cable):
Preferably, you should be using an external sound setup anyway. I consider it an important part of the "retro" experience whether you use a CRT or an HDTV. If you have an external amp or even a stereo system of some sort, use that instead.
You've got your cables all wrong. You're basically switching between composite video with sound (but your cable only carries CSYNC, so no video) and RGB without sync or sound.
Everything goes into the bottom row; the top row is for encoded signals (composite, S-Video). Put the red, green, and blue BNCs where they're labelled R, G, and B, the black BNC into EXT SYNC IN, and either the white or red RCA cable into AUDIO IN. Then press EXT SYNC, and press in both A/RGB and LINE/RGB.
I would strongly suggest investing in an external audio solution; right now you'll get shit sound, and only half of it (one channel).
It would also be worth trying to find a manual for your monitor online. It should make for an interesting read and explain everything you need to know to get the most out of it.
It's more than just novelty. Unless you have good upscaling hardware, 480p Gamecube games and the entire Wii library will look better at native res on a 480p CRT. Even then a good CRT should still have the edge since there won't be scaling artifacts and HDTV lag. The properties of the CRT screen would also do a better job of masking low res textures and lack of AA.
Wii + HDTV is still worth it because many of the games are meant to be played about 6 feet back with Wii remote pointing and gestures... also 4+ multiplayer games. So a 14-20" CRT isn't always the best choice.
I've been thinking about it some more and I think I figured out what you were saying. That it would be a work around, while temporarily losing some picture quality, from that sync drop in game resolution switches. Maybe the TV would still drop sync though, going from a 480p to 480i input on the fly?
What? Da fuck? I can't even... I guess all I can tell you at this point is that I've explained it to you, but it went in one ear and out the other. I feel that if I explained it again, it would be in vain.
I don't have a dedicated upscaler to compare, but my Wii games really do look better on Wii + 480p CRT than Wii U + 1080p Panasonic plasma. It's a pretty big difference: the former looks clean and sharp, basically as perfect as can be outside of emulation, while the latter looks kind of blurry and less appealing for lack of a better description. I don't mean on a personal-taste level but an observably degraded picture that you can actually point to as having problems. This is where I suppose an upscaler would bring the comparison closer, but the TV on its own is certainly worse.
Maybe this weekend I'll take pictures and see if the difference can be captured by my camera. I got a tripod a couple of weeks ago, allowing me to use the lowest ISO settings (higher ISO destroys fine detail), so my recent photos have been as sharp, detailed, and camera-shake-free as possible.
I've been thinking about it some more and I think I figured out what you were saying. That it would be a work around, while temporarily losing some picture quality, from that sync drop in game resolution switches. Maybe the TV would still drop sync though, going from a 480p to 480i input on the fly?
On my plasma set I can't see a difference in Wii games whether they're running on the Wii U's 480p or 1080p settings. I repeatedly went back and forth the day you told me the plasma has better scaling capability than the Wii U, but I just couldn't see it. It's not set to game mode either; I use recommended custom settings from an AVS forum thread about my TV.
The difference between your results and mine may be that my Panasonic is a year older model and possibly has a worse internal scaler than the following year's upgrade.
Yeah but what happens, hypothetically, on your TV screen when your HDTV is receiving 480p and then all of a sudden is receiving 480i? I don't think that is something people have ever really tested, because this scenario (hooking up a line doubler that can pass through 480i when the game switches) has never been done. Sync drops, like what the FM does, might be inherent to scaling processors. Wouldn't it be weird if it was something unique to the FM, which was designed to handle all of these weird resolutions and sync frequencies?
So now you get the best of both worlds: the OSSC line doubles the 240p bits of the game and the TV deinterlaces and scales the 480i bits. Both devices working in tandem, harmoniously achieving the ideal picture quality from my game.
Wait a minute... would the receiver between the console and TV be handling the upscaling if it's not set to HDMI passthrough?! That might explain why I don't see a difference.
Yeah but what happens, hypothetically, on your TV screen when your HDTV is receiving 480p and then all of a sudden is receiving 480i? I don't think that is something people have ever really tested, because this scenario (hooking up a line doubler that can pass through 480i when the game switches) has never been done. Sync drops, like what the FM does, might be inherent to scaling processors. Wouldn't it be weird if it was something unique to the FM, which was designed to handle all of these weird resolutions and sync frequencies?
That's a good test case, though not exactly the same. Still, isn't it surprising that the FM would have inferior resyncing speed compared to stock HDTV upscalers?
Actually, that is tested all the time: when switching a PS2 or GC game to progressive scan, it takes a split second for the sync to re-establish. Nowhere near as bad as switching 240p to 480 in the FM's input. Blink and you'll miss it.
So now that I think about it, manual switching on HDTVs has been standard practice for us retro enthusiasts. lol
What brand is it? I know Onkyos can upscale; they can be toggled to upscale or pass through, like my 705. Try hooking up directly for just a quick test. If you see a difference then, see if you can turn off scaling on your receiver.
It certainly wouldn't change from 480i to 480p mid-game since that means millions of people playing Wii on CRTs in mid 2000s would have been staring at black or garbled screens.
Oh, you're totally right! I forgot about that. But does the option even come up if the person is using a 240p/480i display? Purely speculating, but I think there would be a software safeguard in place that would prevent a person from enabling 480p if the TV doesn't support it. It certainly wouldn't change from 480i to 480p mid-game since that means millions of people playing Wii on CRTs in mid 2000s would have been staring at black or garbled screens. That's why I think genius's hypothetical 480p<-->480i example is not worth being concerned about, since it never happens in the middle of gameplay.
If I'm not mistaken, the safeguard was that it would ask "are these settings OK?"... as far as I recall, that is. I remember the Xbox 360 doing that.
PS2 games certainly don't disable 480p options when on a 480i display. I'd never heard of it damaging a display; I'm sure I've tried to run 480p on a 480i CRT in the past without thinking about it.
IIRC my displays always just gave me black screens when trying to output 480p.
http://filthypants.blogspot.com/2014/03/tvs-and-retro-gaming-emulation.html
One of the major limiting factors in a CRT is the horizontal scan rate, which is the frequency at which a display can move the electron gun from the left side of the display to the right and back again. CRT monitors, like the kind you would find attached to a crummy old Packard Bell computer, have a high horiz. scan rate of 31 kHz, while NTSC TVs have a comparatively low scan rate of 15 kHz. Furthermore, devices that expect the high scan rate of 31 kHz displays and send a high-resolution signal are not compatible with--and can actually damage--displays with the lower scan rate if connected.
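For the curious, those two figures fall straight out of the line count and refresh rate. Here's a rough back-of-the-envelope check in Python using the standard NTSC timing numbers (my own arithmetic, not something from the linked post):

[code]
# Horizontal scan rate = scanlines drawn per refresh * refreshes per second.

def h_scan_khz(lines_per_refresh: float, refresh_hz: float) -> float:
    """Horizontal scan rate in kHz."""
    return lines_per_refresh * refresh_hz / 1000.0

# 480i: 525 total lines split into two fields of 262.5 at ~59.94 fields/s.
# 240p console output draws roughly the same ~262 lines per progressive frame.
print(f"15 kHz modes (240p/480i): {h_scan_khz(262.5, 59.94):.2f} kHz")  # ~15.73 kHz

# 480p: all 525 lines drawn every frame at ~59.94 Hz.
print(f"31 kHz modes (480p):      {h_scan_khz(525.0, 59.94):.2f} kHz")  # ~31.47 kHz
[/code]

So a 31 kHz signal asks a 15 kHz set's deflection circuitry to sweep roughly twice as fast as it was designed for, which is presumably why the blog warns it can cause damage.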
Welcome! Congrats on the pickups too! The Hitachi looks...interesting ha. Nice pickup with the PVM.
First post on GAF :3... Hi all
Still a great deal. Broken PVMs here in the US go for $100+.