
Upscalers, CRTs, PVMs & RGB: Retro gaming done right!

Status
Not open for further replies.

Peltz

Member
There still is a pause, though. And it can apparently be exacerbated by switches, splitters, displays, capture cards, etc.

Honestly resolution changes are just a bummer in general. There isn't really a good solution that I'm aware of.
It's called a CRT :p
 
I'm not entirely sure how the OSSC works, but it's entirely possible that it can automatically pass through 480i in some optional setting. I used RE2 as an example where that would be a handy feature. Segments in the game that are 240p would be line doubled and segments in the game that are 480i wouldn't be blurred by the OSSC. This way would achieve the best possible image quality.
I guess I don't see why, if your TV can handle 480i well, you would even be using the OSSC or any upscaler. Because if it can handle 480i BETTER than the OSSC it can probably handle 240p just fine. Seems to me if your TV can handle 480i better than an upscaler, it's probably a CRT TV lol.
 

Mega

Banned
Looking at what I'll actually use the PVM for, it's best I go with the 20M4U anyway. For 480p+ content I prefer having it scale on a larger modern screen, but for 240p I like the look that a PVM/CRT has. I don't think, aside from the novelty, I'd get too much use out of running 480p content on a PVM.

It's more than just novelty. Unless you have good upscaling hardware, 480p Gamecube games and the entire Wii library will look better at native res on a 480p CRT. Even then a good CRT should still have the edge since there won't be scaling artifacts and HDTV lag. The properties of the CRT screen would also do a better job of masking low res textures and lack of AA.

Wii + HDTV is still worth it because many of the games are meant to be played about 6 feet back with Wii remote pointing and gestures... also 4+ multiplayer games. So a 14-20" CRT isn't always the best choice.
 
It's more than just novelty. Unless you have good upscaling hardware, 480p Gamecube games and the entire Wii library will look better at native res on a 480p CRT. Even then a good CRT should still have the edge since there won't be scaling artifacts and HDTV lag. The properties of the CRT screen would also do a better job of masking low res textures and lack of AA.

Wii + HDTV is still worth it because many of the games are meant to be played about 6 feet back with Wii remote pointing and gestures... also 4+ multiplayer games. So a 14-20" CRT isn't always the best choice.

480p looks a LOT better on a CRT. I usually play them on a 21" NEC computer monitor. The only con is I miss out on widescreen for games that support it.

I keep 27" trinitron for the 4player stuff though. I don't want us all huddling around my 20" bvm or my 21" monitor
 

Bancho

Member
You'd pay £200 for a PVM? Madness.

Though, I know the market has changed since I got my 20M4E for £20. lol

I've noticed the market here in the UK has gone crazy for monitors. Glad I got mine when I did: 14M2E = £10, 20M4E = £30, and my 20" JVC was 99p lol.

Could probably flip them all for substantial profit but I will not get them for that price again!
 
So I posted this in another topic but this is probably more appropriate here. I finally RGB modded my SNES mini and hooked it up to my Sony PVM with my SCART cables, but I've encountered an issue: I can get audio and I can get video, but not both at the same time. I've been trying to figure out either what I've done wrong or what settings I need to adjust, but I haven't been able to work it out.
 

Peagles

Member
So I posted this in another topic but this is probably more appropriate here. I finally RGB modded my SNES mini and hooked it up to my Sony PVM with my SCART cables, but I've encountered an issue: I can get audio and I can get video, but not both at the same time. I've been trying to figure out either what I've done wrong or what settings I need to adjust, but I haven't been able to work it out.

Are you able to test the cable with other consoles?

What are you doing differently to get audio only sometimes and video only other times?
 
but its heavvyyyy
Cell phone cameras are thin and can probably easily fit behind the unit without rotating it. That's how I took a photo of my Trinitron's cable input panel, anyway.

I kind of want to break out the service menu to lessen the overscan, though I'm really wary of doing so since it's not like I'd have any idea what I'm doing.
 

televator

Member
I guess I don't see why, if your TV can handle 480i well, you would even be using the OSSC or any upscaler. Because if it can handle 480i BETTER than the OSSC it can probably handle 240p just fine. Seems to me if your TV can handle 480i better than an upscaler, it's probably a CRT TV lol.

What? Da fuck? I can't even... I guess all I can tell you at this point is that I've explained it to you, but it went in one ear and out the other. I feel that if I explained it to you again, it would be in vain.
 
Are you able to test the cable with other consoles?

What are you doing differently to get audio only sometimes and video only other times?

So I'm currently at work but I was able to find my TV in Google image search.
lPvHbFh.jpg



EDIT:


So here's where I plugged in the video cables (the white arrow is actually representing a black cable):
5rJzJSO.jpg


As I was marking up the image of the back of the TV I started to realize that I may have plugged the audio into the wrong ports. There's a white audio cable and a red one....where should they be plugged in?


EDIT2:


I think I just realized that I have the black video cable plugged into the wrong spot. It should go into EXT SYNC IN, correct?
 

televator

Member
So I'm currently at work but I was able to find my TV in Google image search.
lPvHbFh.jpg



EDIT:


So here's where I plugged in the video cables (the white arrow is actually representing a black cable):
5rJzJSO.jpg


As I was marking up the image of the back of the TV I started to realize that I may have plugged the audio into the wrong ports. There's a white audio cable and a red one....where should they be plugged in?


EDIT2:


I think I just realized that I have the black video cable plugged into the wrong spot. It should go into EXT SYNC IN, correct?

Preferably, you should be using an external sound setup anyway. I consider it to be an important part of the "retro" experience no matter if you use CRT or an HDTV. If you have an external amp or even a stereo system of some sort, use that instead.
 
It's called a CRT :p
I was thinking in light of timu's desire to record. CRTs don't help with that, unfortunately lol.

Wait, did MLiG say the video out would resolve that problem? Every CRT I know of with video out just passes the incoming video signal through; it shouldn't do anything to resolve the capture card's dislike of resolution changes.
 

Khaz

Member
So here's where I plugged in the video cables (the white arrow is actually representing a black cable):

You got your cables all wrong. You're basically switching between composite video with sound (but you only have CSYNC in your cable, so no video) and RGB without sync or sound.

Everything goes into the bottom row. The top row is for encoded signals: composite and S-Video. Put your red, green, and blue BNCs where it's labelled R, G, and B, the black BNC into EXT SYNC IN, and either the white or red RCA cable into AUDIO IN. Then press EXT SYNC, and depress both A/RGB and LINE/RGB.

I would strongly suggest investing in an external audio solution; right now you will get shit sound, and only half of it.

It would be interesting for you to try and find a manual online for your monitor. It should make for an interesting read and should explain everything you need to know to make the most of your monitor.
 
Preferably, you should be using an external sound setup anyway. I consider it to be an important part of the "retro" experience no matter if you use CRT or an HDTV. If you have an external amp or even a stereo system of some sort, use that instead.

This. PVMs have shitty mono speakers and only a mono input. Even crappy PC speakers sound 10x better. It's hard to find speakers that can sound as bad as a PVM. Even the Amazon basics speakers would be a big upgrade. You would need an adapter like this for most of these kinds of powered speakers. That's the low end, from there on up you have many other options like using a receiver and speakers, or a bookshelf stereo system, etc.
 

Peagles

Member
You got your cables all wrong. You're basically switching between composite video with sound (but you only have CSYNC in your cable, so no video) and RGB without sync or sound.

Everything goes into the bottom row. The top row is for encoded signals: composite and S-Video. Put your red, green, and blue BNCs where it's labelled R, G, and B, the black BNC into EXT SYNC IN, and either the white or red RCA cable into AUDIO IN. Then press EXT SYNC, and depress both A/RGB and LINE/RGB.

I would strongly suggest investing in an external audio solution; right now you will get shit sound, and only half of it.

It would be interesting for you to try and find a manual online for your monitor. It should make for an interesting read and should explain everything you need to know to make the most of your monitor.

I concur.
 
It's more than just novelty. Unless you have good upscaling hardware, 480p Gamecube games and the entire Wii library will look better at native res on a 480p CRT. Even then a good CRT should still have the edge since there won't be scaling artifacts and HDTV lag. The properties of the CRT screen would also do a better job of masking low res textures and lack of AA.

Shhh, trying to ease the "loss" of that 20L5 over here! Just going off how 480i looks with the Gamecube on my PVM there is definitely an appeal to how it looks on a PVM but I still prefer having the larger size and ease of use that comes with my current setup.

Also, I've already committed my Gamecube to an HDMI mod so I'm not about to run off and mod another one for analogue video; especially not when there are so many other 240p consoles I have to update for RGB now!
 

televator

Member
It's more than just novelty. Unless you have good upscaling hardware, 480p Gamecube games and the entire Wii library will look better at native res on a 480p CRT. Even then a good CRT should still have the edge since there won't be scaling artifacts and HDTV lag. The properties of the CRT screen would also do a better job of masking low res textures and lack of AA.

Wii + HDTV is still worth it because many of the games are meant to be played about 6 feet back with Wii remote pointing and gestures... also 4+ multiplayer games. So a 14-20" CRT isn't always the best choice.

I've played a good chunk of Wii games and I'd say there's no real difference in playing a Wii game or GC game on an HDTV aside from a few exceptions like Mario Galaxy... Also aside from the softer Wii picture, of course... Actually it could be argued the other way around that the GameCube looks better in general than the Wii on an HDTV because of that. Heh

Also, a good HDTV won't present artifacts on 480p or even 480i material. So if we're talking ideal CRT then it's fair to bring up the ideal HDTV. The input lag still remains a huge disadvantage on HDTVs though. Bad textures and lack of AA, yeah, it doesn't bother me much to see them unmasked, but other people's preferences can differ. What really does bother me is the visible heavy-handed dithering that's fully exposed on an HDTV. It's a big reason why I'm looking to emulation as a preferred solution for some games. That much I do have to admit.
 

Mega

Banned
I don't have a dedicated upscaler to compare, but my Wii games really do look better on Wii + 480p CRT than Wii U + 1080p Panasonic plasma. It's a pretty big difference where the former looks clean and sharp, basically as perfect as can be outside of emulation. The latter looks kind of blurry and less appealing for lack of a better description. I don't mean on a personal taste level but an observable degraded picture that you can actually point to as having problems. This is where I suppose an upscaler would bring the comparison closer, but the TV on its own is certainly worse.

Maybe this weekend I'll take pictures and see if the difference can be captured by my camera. I got a tripod a couple weeks ago allowing me to use the lowest ISO settings (higher destroys fine photo details), so my recent photos have been as sharp, detailed and camera shake-free as possible.
 
What? Da fuck? I can't even... I guess all I can tell you at this point is that I've explained it to you, but it went in one ear and out the other. I feel that if I explained it to you again, it would be in vain.
I've been thinking about it some more and I think I figured out what you were saying: that it would be a workaround for the sync drop during in-game resolution switches, at the cost of temporarily losing some picture quality. Maybe the TV would still drop sync, though, going from a 480p to a 480i input on the fly?
 

televator

Member
I don't have a dedicated upscaler to compare, but my Wii games really do look better on Wii + 480p CRT than Wii U + 1080p Panasonic plasma. It's a pretty big difference where the former looks clean and sharp, basically as perfect as can be outside of emulation. The latter looks kind of blurry and less appealing for lack of a better description. I don't mean on a personal taste level but an observable degraded picture that you can actually point to as having problems. This is where I suppose an upscaler would bring the comparison closer, but the TV on its own is certainly worse.

Maybe this weekend I'll take pictures and see if the difference can be captured by my camera. I got a tripod a couple weeks ago allowing me to use the lowest ISO settings (higher destroys fine photo details), so my recent photos have been as sharp, detailed and camera shake-free as possible.

Is the Wii U set to 1080p? Because yeah, the Wii U's scaling sucks. My Panasonic looks clearly better than both the Framemeister and the Wii U at 1080p. I can't say either my Sony CRT or the XM29 "looks better" than it, my manic notice of geometry quirks aside... I should also mention that I don't really bother setting the plasma to "game mode". I let it do its processing.

I've been thinking about it some more and I think I figured out what you were saying: that it would be a workaround for the sync drop during in-game resolution switches, at the cost of temporarily losing some picture quality. Maybe the TV would still drop sync, though, going from a 480p to a 480i input on the fly?

I don't know why I'm doing this but okay... Back to square one. I believe your question was: what is the point of a 480i passthrough on this line doubler? I'll assume that's the correct question. You can tell me later if it's not.

Now, again, I'll start off by explaining how a line doubler works, in particular the OSSC. The OSSC is a strict line doubler/tripler. It does not "scale" in the normal sense, nor does it actually deinterlace 480i... because it's a line doubler/tripler. What it does to 480i is essentially the same thing it does to 240p. This function is fantastic for 240p because it maintains the sharpness and information of the original picture by just duplicating every line of a 240p frame into a 480p frame. Slap on some scanlines and you have somewhat of an approximation of CRT quality. This is clearly something most HDTVs can't even come close to with their internal scalers, which try to deinterlace 240p. With 480i, however, line doubling, despite being lag free, looks pretty bad.
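The line-doubling step described above can be sketched in a few lines of Python. This is a toy model of the idea, not OSSC firmware; a "frame" here is just a list of scanlines:

```python
def line_double(frame):
    """Duplicate every scanline, turning a 240-line frame into a 480-line one."""
    doubled = []
    for scanline in frame:
        doubled.append(scanline)
        doubled.append(scanline)  # same pixels drawn twice: no blending, no blur
    return doubled

# Toy 3-line "frame": each scanline is a list of pixel values.
frame_240p = [[10, 20], [30, 40], [50, 60]]
print(line_double(frame_240p))  # 6 lines, each original line repeated
```

Because each line is copied verbatim rather than interpolated, none of the original picture information is blended away, which is why line-doubled 240p stays sharp.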

Some people can live with line doubled 480i as a trade-off for lag-free processing. Others, like myself, cannot live with that at all and just so happen to own an HDTV that does 480i deinterlacing quite brilliantly... BUT our TVs either don't accept 240p or the damned things try to deinterlace 240p. So the OSSC can be employed to handle 240p instead of the TV... but like I said, my TV does a better job with 480i than any line doubler ever could. So what happens in a game that has variable resolutions? Do I just live with the crappy image quality of line doubled 480i in certain areas of the game? Well, it's not like I have a choice, because my TV can't handle 240p at all... Except, now, the OSSC has a passthrough function for 480i that can potentially fix that. This tells the OSSC to not process 480i at all, and kick the can down to the HDTV instead. In theory, it could feed an interlaced picture to the HDTV when a game changes from 240p to 480i. There might be a small pause when the resolution switch occurs, but from the accounts I've read, it's nowhere near as bad as the Framemeister. So now you get the best of both worlds: the OSSC line doubles the 240p bits of the game and the TV deinterlaces and scales the 480i bits. Both devices working in tandem, harmoniously achieving the ideal picture quality from my game.
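The routing described above, where the OSSC handles 240p and the TV handles 480i, can be sketched as a simple dispatcher (a hypothetical model of the behaviour, not the OSSC's actual firmware logic):

```python
def route(signal):
    """Toy model of the 480i passthrough setting: line-double 240p,
    hand 480i to the TV untouched so its deinterlacer can take over."""
    mode, scanlines = signal
    if mode == "240p":
        doubled = []
        for s in scanlines:
            doubled += [s, s]          # OSSC does the work: 240 lines -> 480
        return ("480p", doubled)
    if mode == "480i":
        return (mode, scanlines)       # passthrough: the HDTV deinterlaces
    raise ValueError(f"unhandled mode: {mode}")

print(route(("240p", ["a", "b"])))     # -> ('480p', ['a', 'a', 'b', 'b'])
print(route(("480i", ["f1", "f2"])))   # unchanged, TV takes over
```

The point of the split is that each device only ever sees the signal it is good at: the line doubler never touches interlaced material, and the TV never sees 240p it would mangle.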
 

Mega

Banned
On my plasma set I can't see a difference in Wii games whether they're running on Wii U 480p or 1080p settings. I repeatedly went back and forth the day you told me the plasma has better scaling capability than the Wii U, but I just couldn't see it. It's not set to game mode either. I have recommended custom settings from an AVS forum thread about my TV.

The difference between your results and mine may be that my Panasonic is a year older model and possibly has a worse internal scaler than the following year's upgrade.
 

televator

Member
On my plasma set I can't see a difference in Wii games whether they're running on Wii U 480p or 1080p settings. I repeatedly went back and forth the day you told me the plasma has better scaling capability than the Wii U, but I just couldn't see it. It's not set to game mode either. I have recommended custom settings from an AVS forum thread about my TV.

The difference between your results and mine may be that my Panasonic is a year older model and possibly has a worse internal scaler than the following year's upgrade.

It could be, though it's a little hard to think it could be such a stark difference in just one year... ... ... Then again... the lag introduced by my Panasonic's processing increased quite a bit over yours. It is entirely possible that my TV has more picture-massaging computation going on. Huh.

I can even describe the difference between the Wii U upscale and my plasma. The Wii U upscale clearly exhibits edge artifacts. They're clear as day: excessive stair stepping that exacerbates aliasing and even introduces erroneous coloration on some edges. My ST60, on the other hand... well, if I didn't know any better I'd swear the damned thing was running 480p natively. No sharpness or softness added by filtering; edges look exactly how an unaliased game looks on my PC, that is to say "natural" or "default". No excess artifacting. It was a completely stunning moment when I compared it to the Framemeister and Wii U and realized how far ahead my Panasonic was.
 

Mega

Banned
Wait a minute... would the receiver between the console and TV be handling the upscaling if it's not set to HDMI passthrough?! That might explain why I don't see a difference.
 
So now you get the best of both worlds. The OSSC line doubles the 240p bits of the game and the TV deinterlaces and scales the 480i bits. Both devices working in tandem. Harmoniously, achieving the ideal picture quality from my game.
Yeah, but what happens, hypothetically, on your TV screen when your HDTV is receiving 480p and then all of a sudden is receiving 480i? I don't think that is something people have ever really tested, because this scenario (hooking up a line doubler that can pass through 480i when the game switches) has never been done. Sync drops, like what the FM does, might be inherent to scaling processors. Wouldn't it be weird if it were something unique to the FM, which was designed to handle all of these weird resolutions and sync frequencies?
 

televator

Member
Wait a minute... would the receiver between the console and TV be handling the upscaling if it's not set to HDMI passthrough?! That might explain why I don't see a difference.

What brand is it? I know Onkyos can upscale. They can be toggled to upscale or pass through like my 705. Try hooking up directly for just a quick test. If you see a difference then, see if you can turn off scaling on your receiver.

Yeah but what happens, hypothetically, on your TV screen when your HDTV is receiving 480p and then all of a sudden is receiving 480i? I don't think that is something people have ever really tested, because this scenario (hooking up a line doubler that can pass through 480i when the game switches) has never been done. Sync drops, like what the FM does, might be inherent to scaling processors. Wouldn't it be weird if it was something unique to the FM, which was designed to handle all of these weird resolutions and sync frequencies?

It might happen, that's a good question. Though I don't expect it to be as bad as when you switch resolutions on the Framemeister. It can take what feels like 5 seconds going 480p to 720p on the FM for example.
 

Mega

Banned
^Responding to geniusbits: Televator already covered that exact scenario in his explanatory post. The upscaler line doubles 240p to 480p so the TV can display it. If the game switches to 480i, the TV takes over if 480i passthrough is enabled.

If you mean a native 480p source switching to 480i, that would never happen because they're different scanning frequencies and no game to my knowledge was made to do such a thing, if it's even possible. A Wii game for example is 480i or 480p all the way through depending on the display. A 480i game that forced 480p at some point would be pulled from shelves and lead to lawsuits for damaging SD televisions. I know at Windows bootup, CRT Emudriver shows a VGA resolution before switching to a SD resolution at login. During that time it's recommended you keep an SD display off or risk damaging the tube.
 

televator

Member
Actually, that is tested all the time: when switching a PS2 or GC game to progressive scan. It takes a split second for the sync to re-establish. Nowhere near as bad as switching 240p to 480p on the FM's input. Blink and you'll miss it.

So now that I think about it, manual switching on HDTVs has been standard practice for us retro enthusiasts. lol
 
Actually, that is tested all the time: when switching a PS2 or GC game to progressive scan. It takes a split second for the sync to re-establish. Nowhere near as bad as switching 240p to 480p on the FM's input. Blink and you'll miss it.

So now that I think about it, manual switching on HDTVs has been standard practice for us retro enthusiasts. lol
That's a good test case though not exactly the same. Still though, isn't it surprising that the FM would have inferior resyncing speed compared to stock HDTV upscalers?
 
So I'm currently at work but I was able to find my TV in Google image search.
lPvHbFh.jpg



EDIT:


So here's where I plugged in the video cables (the white arrow is actually representing a black cable):
5rJzJSO.jpg


As I was marking up the image of the back of the TV I started to realize that I may have plugged the audio into the wrong ports. There's a white audio cable and a red one....where should they be plugged in?


EDIT2:


I think I just realized that I have the black video cable plugged into the wrong spot. It should go into EXT SYNC IN, correct?


Looks like you got my PVM monitor or a variation of it. I have the Sony PVM-20M2U and mine looks exactly like that.

Also, I use external audio so I can't really help much on that front. I would personally just use some cheap PC speakers and get:

a) 3.5mm Male to 2RCA Female cable
b) 3.5mm Female to Female coupler

If you can get some external powered speakers that would be even better.
 

Mega

Banned
Actually, that is tested all the time: when switching a PS2 or GC game to progressive scan. It takes a split second for the sync to re-establish. Nowhere near as bad as switching 240p to 480p on the FM's input. Blink and you'll miss it.

So now that I think about it, manual switching on HDTVs has been standard practice for us retro enthusiasts. lol

Oh, you're totally right! I forgot about that. But does the option even come up if the person is using a 240p/480i display? Purely speculating, but I think there would be a software safeguard in place that would prevent a person from enabling 480p if the TV doesn't support it. It certainly wouldn't change from 480i to 480p mid-game, since that would mean millions of people playing Wii on CRTs in the mid-2000s would have been staring at black or garbled screens. That's why I think genius's hypothetical 480p <--> 480i example is not worth being concerned about, since it never happens in the middle of gameplay.

What brand is it? I know Onkyos can upscale. They can be toggled to upscale or pass through like my 705. Try hooking up directly for just a quick test. If you see a difference then, see if you can turn off scaling on your receiver.

It's a Yamaha RX-S600. I tried enabling standby passthrough, which I don't think is the same as actual video passthrough. With standby, I can turn off the receiver and still show a picture: the screen switches off for a second before it resumes displaying the game. From eyeballing and looking at zoomed in photos on my phone, I can't see any difference in Wii games between:

-Wii U at 1080p
-Wii U at 480p with receiver on
-Wii U at 480p with receiver off and passthrough enabled


I think the receiver does not handle upscaling. There's a receiver menu option that shows signal properties.

-If Wii U is set to 1080p, it says video in - 1080p, video out - 1080p as expected, the console is handling upscaling.
-If I set the system to 480p, the receiver says video in - 480p, video out - 480p... indicating it's not doing upscaling.

In all cases, it all looks the same. I did this all in the morning before work... I'll try connecting the console directly to the TV to see if it looks any different.
 
PS2 games certainly don't disable 480p options when on a 480i display. I'd never heard of it damaging a display; I'm sure I've tried to run 480p on a 480i CRT in the past without thinking about it.

IIRC my displays always just gave me black screens when fed 480p.
 

televator

Member
Oh, you're totally right! I forgot about that. But does the option even come up if the person is using a 240p/480i display? Purely speculating, but I think there would be a software safeguard in place that would prevent a person from enabling 480p if the TV doesn't support it. It certainly wouldn't change from 480i to 480p mid-game, since that would mean millions of people playing Wii on CRTs in the mid-2000s would have been staring at black or garbled screens. That's why I think genius's hypothetical 480p <--> 480i example is not worth being concerned about, since it never happens in the middle of gameplay.

Well, on GC, there are just two pins on the component cable that the game detects for the 480p option to come up. In our example we are assuming the 480p is coming from the line doubling of a 240p source. There are games that switch on the fly between 240p and 480i, hence our conversation around 480p <--> 480i.
 

Mega

Banned
PS2 games certainly don't disable 480p options when on a 480i display. I'd never heard of it damaging a display; I'm sure I've tried to run 480p on a 480i CRT in the past without thinking about it.

IIRC my displays always just gave me black screens when fed 480p.

Most go to a black screen, which I think is a safety feature built into the circuitry to shut off or idle the screen. Some CRTs do not do this (older ones, I think). A couple of my Sony monitors, for example, will try to display a signal past their limits and it's not pretty: the screen is a doubled mess and sometimes there's a disturbing high-pitched whine. This is just for 480p; I wouldn't try to find out what happens with 720p or higher. I've read on several occasions that doing this repeatedly or for a prolonged time will stress the tube and can cause permanent damage. I'm not trying to find out if that's true!

Basically the same point as above:

One of the major limiting factors in a CRT is the horizontal scan rate, which is the frequency at which a display can move the electron gun from the left side of the display to the right and back again. CRT monitors, like the kind you would find attached to a crummy old Packard Bell computer, have a high horiz. scan rate of 31 kHz, while NTSC TVs have a comparatively low scan rate of 15 kHz. Furthermore, devices that expect the high scan rate of 31 kHz displays and send a high-resolution signal are not compatible with--and can actually damage--displays with the lower scan rate if connected.
http://filthypants.blogspot.com/2014/03/tvs-and-retro-gaming-emulation.html
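The 15 kHz vs 31 kHz figures quoted above fall straight out of the standard NTSC line count and field rate; this quick back-of-envelope check uses only those published constants:

```python
TOTAL_LINES = 525          # NTSC scanlines per full frame, blanking included
FIELD_RATE = 60000 / 1001  # ~59.94 Hz: fields/s for 480i, frames/s for 480p

# 480i draws one field (half the lines) per vertical pass, so the beam
# sweeps 262.5 lines x ~59.94 passes per second:
h_freq_480i = (TOTAL_LINES / 2) * FIELD_RATE   # ~15.7 kHz, the "TV" rate
# 480p draws all 525 lines on every pass:
h_freq_480p = TOTAL_LINES * FIELD_RATE         # ~31.5 kHz, the "VGA-class" rate

print(f"480i: {h_freq_480i / 1000:.2f} kHz, 480p: {h_freq_480p / 1000:.2f} kHz")
```

This also shows why 480p is exactly twice the horizontal load of 480i: same line count, but every line on every vertical pass, which a 15 kHz deflection circuit simply cannot sweep fast enough to draw.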
 
I set GT4 to 480p the other day on my M4E and it seemed to show two images next to each other, which was weird. Not playable, obviously.

1080i was a mental snow pattern, worth a try!
 
I remember people talking about 480p being sent to a 15 kHz display causing doubling. Is it the whole image? I would think it would be at half the resolution but doubled, because the beam is being told to skip down faster than it can write, so half of the lines would be in the left image and half in the right.
 

Heaps

Banned
First post on GAF :3... Hi all...

Literally just picked up this bad boy for free off Craigslist.

20" Hitachi CT2000W TV






Love that woodgrain look, love the external speaker outputs, it's definitely got the a e s t h e t i c I love.

Also about a week ago, I picked up this Sony PVM..

 

dubc35

Member
First post on GAF :3... Hi all
Welcome! Congrats on the pickups too! The Hitachi looks... interesting, ha. Nice grab with the PVM.

I should be picking up a CRT today. 21" Sanyo with component input. It's no Trinitron but I'm not convinced I have space for one anyway. It's $5 so not a big deal. We'll see!
 
First post on GAF :3... Hi all...

Literally just picked up this bad boy for free off Craigslist.

20" Hitachi CT2000W TV






Love that woodgrain look, love the external speaker outputs, its definately got the a e s t h e t i c I love.

Also about a week ago, I picked up this Sony PVM..


Welcome! Nice gets.
 

Rich!

Member
Still a great deal. Broken PVMs here in the US go for $100+.

Yeah but $100 is a lot less than £100...

I should just be perfectly happy with my 750 line 17" JVC PVM. It's utterly fantastic...I'm just never satisfied.

edit: wait, broken? oh man, you guys in the US have it tough!
 