
Upscalers, CRTs, PVMs & RGB: Retro gaming done right!


Timu

Member
I think the typical numbers are about 1-2 frames, depending on the source.
I have the number 22ms in my head but I don't know if that's right.

Yes.
1 is OK, 2 is too much! 22ms isn't bad, but I'd rather it be 16ms or less. We'll see though.

Props. Your little box does a good job. It has a little quirk though. When the frame freezes on the victory poses, you see combing. A little odd, but not a deal breaker.
Really? I didn't notice that, lol, I thought the character was interlaced.
 
Lag tolerance is very personal. I'm not dumping on the Framemeister; my Kuro panel has 3-4 frames of lag. But when I add 1-2 frames of lag on top of it for up to six frames of total lag, I find myself fucking up constantly in Super Mario Bros.

If I had a panel with less lag I'm sure I'd be fine with the Framemeister lag.

But anyhow, I want to add scanlines to 480p, and the Framemeister sucks at that. Of course many people don't care about this as they have different priorities and needs.
 

D.Lo

Member
But anyhow, I want to add scanlines to 480p, and the Framemeister sucks at that.
No it doesn't, it's pretty much the best at it. It has transparency settings that are fully configurable, unlike pretty much every other option. It's better than an SLG or the Hanzo/Toro.

It sucks at 1080p scanlines, but everything does because 1080 isn't an integer multiple of 240.
 
No it doesn't, it's pretty much the best at it. It has transparency settings that are fully configurable, unlike pretty much every other option. It's better than an SLG or the Hanzo/Toro.

It sucks at 1080p scanlines, but everything does because 1080 isn't an integer multiple of 240.

Not everything sucks at outputting 1080p scanlines.

1) Configure Wii U for 1080p output.
2) Put your panel in 1:1 pixel mapping.
3) Run 240p test suite on the vWii in 240p->480p with scanlines mode.

The result is beautiful, evenly spaced scanlines of 240p assets output as 1080p and a perfect drop-shadow test.

So it can be done. If only there were a processor to give these same results with Virtual Console games.
 
Framemeister does perfectly fine scanlines at any resolution if you configure your scaling and the lines properly. If they ain't right, your settings ain't right.

If you're talking pre-1.20 or whatever the recent firmware was that overhauled the scanline options, though, then why not just output at 720p and let your TV scale? That'll give you pretty much the best scanlines you can get.
 

missile

Member
... There's also sync on green but I think hardly any device uses that. All my consoles are composite sync and luma sync. ...
My PS2 (Linux Kit) syncs on green.

...
[image: 9ibh1h4.jpg]


...bright screens cause this distortion. It's not rolling the way it does if I remove the composite sync cable — this warping effect is static. And it only shows up on bright screens; I tried it with Castlevania IV and the Konami logo screen (which is white) distorted, but the first stage and title screen (which are dark) looked great... except during lightning flashes, where the image would warp again for as long as the screen was white (a couple of frames).

I've spent some time testing different arrangements, and the problem isn't with the SNES Mini, as the N64 suffers from the same problem with this setup. It presumably isn't the RGB21 cable, either, since I don't see this issue if I bypass the PVM and go straight to the upscaler.

So, I'm wondering if anyone has any advice for:

(1) how to fix the RGB21 to component cable, bearing in mind I have no electrical skills whatsoever, or
(2) recommendations on a more effective RGB-modded SNES/N64-to-component solution that doesn't involve having been an HD Retrovision Kickstarter backer.

Any help would be greatly appreciated! Surely someone else out there has an RGB SNES hooked up to a PVM and can steer me right...
Does it happen with any cable connected? I know of a failure that produces similar behavior: increasing the current in the tube (maximum brightness) can have some effect on the FBT (flyback transformer), reducing the voltage available for horizontal deflection. But I don't think that's the case here. It looks more like the sync pulse gets shifted for some reason.
 

sp3ctr3

Member
I have a 28" CRT that I use for retrogaming but would really like to save some space and hook old consoles up to my TV with a framemeister.

The thing that is stopping me is my love for lightgun games, Point Blank in particular. Is there any way, a magic box or similar, that lets me use my G-Con45 on an LCD/LED?
 

missile

Member
... I think that's what Timu's device does. Switches from minimal lag weaving in still imagery to motion compensation.
Yeah, that's how it works.

Per scanline (1d), per field (2d), per multiple fields (3d) comb filtering.
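For anyone wondering what "switching from weaving to motion compensation" looks like in practice, here's a minimal motion-adaptive deinterlacing sketch in Python/NumPy. It's my own simplification (weave static pixels, bob moving ones, single hard threshold), not how the XRGB or Timu's box actually implements it:

```python
import numpy as np

def deinterlace_motion_adaptive(prev_field, curr_field, threshold=12):
    """Weave where the picture is static, bob (line-double) where it moves.

    prev_field, curr_field: 2D uint8 arrays holding the two most recent
    fields. Comparing them directly is a crude motion test (real boxes
    compare same-parity fields and filter the result), but it shows the idea.
    """
    h, w = curr_field.shape
    frame = np.empty((h * 2, w), dtype=np.uint8)

    # Weave: interleave the two fields into one full-height frame.
    frame[0::2] = curr_field
    frame[1::2] = prev_field

    # Where the fields disagree (motion), weaving would comb, so fall back
    # to bob: reuse the current field's line instead of the stale one.
    motion = np.abs(curr_field.astype(np.int16) - prev_field.astype(np.int16)) > threshold
    frame[1::2][motion] = curr_field[motion]
    return frame
```

This is also why a frozen frame (like the victory poses mentioned earlier) can show combing: if the motion detector misjudges a still image, the woven fields don't line up.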

How much input lag is on the framemeister?
http://www.neogaf.com/forum/showpost.php?p=188331485&postcount=11840


I'm asking myself if interlacing could appear again in the future, given that LCD displays now try to mimic CRTs using BFI techniques and the like, and given that we're getting much higher refresh rates, which would reduce some further issues with interlacing. BFI helps the eye by reducing the sample-and-hold characteristic of common LCD displays, which leads to perceived motion blur. If you blank an illuminated moving spot fast enough (like a CRT does), the eye does its job of predicting the motion ahead, which is basically why interlacing works on CRTs: the eye extrapolates the motion of the previous field, weaving it with the current field to produce an almost proper frame (synced in time). And a higher refresh rate could help reduce some of the annoying interlacing issues, i.e. interline/edge flickering. With a display refresh rate of >= 2x60Hz (60Hz NTSC), edge flickering will be greatly reduced: at 2x (120Hz) the edges flicker at 60Hz, which should be sufficient, especially considering you won't sit close to the TV, and at 3x (180Hz) the now 90Hz flicker will be gone for humans.
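To put rough numbers on the edge-flicker claim, here's my own back-of-the-envelope, assuming each field is simply repeated at the higher panel rate:

```python
# Back-of-the-envelope for the edge-flicker numbers above (NTSC, 60 fields/s).
FIELD_RATE_HZ = 60
LINE_REFRESH_HZ = FIELD_RATE_HZ / 2   # a given line appears in every other field -> 30 Hz

for multiple in (1, 2, 3):
    panel_rate = FIELD_RATE_HZ * multiple      # 60, 120, 180 Hz panel
    edge_flicker = LINE_REFRESH_HZ * multiple  # each field shown `multiple` times
    print(f"{panel_rate:3d} Hz panel -> interline flicker at ~{edge_flicker:.0f} Hz")
```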

Of course, you wouldn't want all this on the studio end, but on the consumer end, with everything adjusted to the perceptual characteristics of humans, you can gain some savings, like cutting the necessary transmission bandwidth in half, saving power, etc., or you can get double the temporal resolution for the same bandwidth.

Don't know if we will see it again. But there is a trend toward adapting all the consumer-facing backends to human perceptual characteristics now that we know more about them, like these adapted RGBW displays:

[image: rgb-v-rgbw-circle-chart.png]



The basic principle on which most of these things rely is that with every increase in resolution etc., the redundancy increases as well, which can be seen by making a spectral analysis of many devices. For example, on a TV with a high resolution, neighboring lines will look quite similar. A spectral analysis of a b/w video signal reveals that (a) the spectrum is discrete due to its periodic nature (scanning) and (b) the multiples of the fundamental frequency (the horizontal line frequency) aren't modulated that much, because the scanlines look quite similar from line to line. This leaves a lot of free space in the luminance spectrum, which translates into redundancy. And this is basically the reason why color TV (NTSC/PAL composite) works at all: it utilizes this redundancy in the luminance spectrum to merge the color information into these free slots. Clever! And it wasn't obvious from the beginning, as can be seen in how RCA struggled to fit the color information into the same 5MHz b/w video bandwidth in the early '50s.
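For a concrete number behind "merging color into the free slots": NTSC puts the colour subcarrier at an odd multiple of half the line frequency, so its sidebands fall exactly between the luma harmonics. A quick check using the standard NTSC constants (nothing specific to any hardware in this thread):

```python
# NTSC frequency interleaving: the colour subcarrier is an odd multiple of
# half the line frequency, so it lands exactly between luma harmonics.
F_H = 4_500_000 / 286        # NTSC line frequency, ~15,734.27 Hz
F_SC = (455 / 2) * F_H       # colour subcarrier, ~3.579545 MHz

below, above = 227 * F_H, 228 * F_H          # neighbouring luma harmonics
print(f"subcarrier         : {F_SC / 1e6:.6f} MHz")
print(f"gap to harmonic 227: {(F_SC - below) / 1e3:.3f} kHz")   # = F_H / 2
print(f"gap to harmonic 228: {(above - F_SC) / 1e3:.3f} kHz")   # = F_H / 2
```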
 

D.Lo

Member
Not everything sucks at outputting 1080p scanlines.

1) Configure Wii U for 1080p output.
2) Put your panel in 1:1 pixel mapping.
3) Run 240p test suite on the vWii in 240p->480p with scanlines mode.

The result is beautiful, evenly spaced scanlines of 240p assets output as 1080p and a perfect drop-shadow test.

So it can be done. If only there were a processor to give these same results with Virtual Console games.
That's not 1080p scanlines, that's 480p with scanlines upscaled post-scanlines to (I assume) 960p. Framemeister (and the Toro and Hanzo and SLG) can do that too if you set it to output at 480p and let your TV scale from 480p.

1080p is not an integer multiple of 240p, so 240p cannot be evenly scaled directly to 1080p.
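To make the arithmetic explicit, here's a toy sketch of what's going on (my own illustration, not any particular scaler's algorithm): 240 lines double cleanly to 480 with a black scanline between each pair, and 480 doubles cleanly again to 960, but reaching 1080 from there needs a fractional 2.25x, so the scanlines end up unevenly thick.

```python
import numpy as np

def scanline_double(image_240):
    """240 source lines -> 480 lines with a black 'scanline' between each pair."""
    doubled = np.repeat(image_240, 2, axis=0)   # 240 -> 480 lines
    doubled[1::2] = 0                           # darken every other line
    return doubled

src = np.full((240, 320), 200, dtype=np.uint8)  # dummy flat 240p frame
scanlined = scanline_double(src)                # 480 lines with scanlines baked in

for target in (960, 1080):
    factor = target / scanlined.shape[0]
    print(f"480 -> {target}: scale factor {factor}")   # 2.0 (clean) vs 2.25 (uneven)
```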
 

Khaz

Member
Does it happen with any cable connected? I know of a failure that produces similar behavior: increasing the current in the tube (maximum brightness) can have some effect on the FBT (flyback transformer), reducing the voltage available for horizontal deflection. But I don't think that's the case here. It looks more like the sync pulse gets shifted for some reason.

Hey Missile, I have a TV that has had a similar problem forever. Basically, two vertical parallel lines appear more or less convex, depending on the amount of white between them. When I lower the brightness to the minimum, to get a black and white picture, the geometry corrects itself, while the picture itself shrinks to show overscan all around. Really weird.
 

Galdelico

Member
Hey Missile, I have a TV that has had a similar problem forever. Basically, two vertical parallel lines appear more or less convex, depending on the amount of white between them. When I lower the brightness to the minimum, to get a black and white picture, the geometry corrects itself, while the picture itself shrinks to show overscan all around. Really weird.

This happens on my 21" Sony Trinitron too. On top of that, if I crank up the contrast, the brighter/more garish the image gets, the wider the picture's H-size becomes.

Oh, and there's one thing that truly boggles my mind (I'm just a curious, yet completely uneducated, animal about this kind of stuff) that happens with my Japanese SEGA Saturn. Basically, with my previous Trinitron (KV21FT1E 21), I could resize the screen horizontally until the whole frame was shown with no overscan. I.e., even though CAPCOM games are supposed to look a bit squashed on the Saturn, due to their resolution, I could still see the entire image if I wanted. Now, with my new CRT (KV21FX30B), no matter how much I lower the H-size, the image remains slightly cropped at the sides. With Super Street Fighter II, for example, a bit less than half of the winning stars beside the life bars is cut off from the screen, and if I reduce the H-size, they're still cut off, just on a black background.
Other oddities, once again compared to my older TV, include both the Japanese Saturn and Mega Drive/Mega-CD booting in 16:9 on the AV-1 channel using SCART RGB cables (the Japanese PS2 regularly boots in 4:3 on the same channel, in RGB), and the Saturn hi-res games flickering way less (not at all, actually, I'd say) on the newer Trinitron. It all feels like wizardry to me. :D
 

Ban Puncher

Member
Got my Selecty21 today. Plugged the Neo Geo AES and Super Famicom into it and got picture/sound from the NG but not from the SFC. Removed the Selecty21 from the chain and plugged each system into the Framemeister individually and they worked fine.


Shit.


I then played with the Sync Level on the Framemeister and changed it from 9 to 14. Now both consoles work fine through the Selecty21.


Phew.
 
Huh. I assumed the Framemeister would be like, $500 minus cables. If that's the case, I might definitely go that route. Where's the best place to buy one?
As stated by others, Solaris and Amazon are the major sellers. Sometimes a third-party seller will pop up with similar or slightly lower prices, and you could keep your eyes open for someone selling their used one.
 
Interesting video about restoring and calibrating PVMs

https://www.youtube.com/watch?v=yoBqm4ThJxw&feature=related
I tried asking this in the video comments, but does anyone have advice on how to find techs competent enough to restore CRTs? I have a consumer Trinitron that I had worked on by a guy who seemed to know what he was doing; he said he replaced some of the parts known to fail in my model, but the workers in that video seem to be on another level. Are those geometry-fixing magnets available to buy? Are they easy to use? Can you just stick them where the problem is and then go back to adjusting pots? As always, this stuff intimidates the hell out of me, and seeing such competent restoration amazes me.
 

Chinner

Banned
I might be able to pick up either a PVM-20M2E or a PVM-20M4E. I understand that the M4E is the high-end model, but would I be right in assuming the M2E might be preferable, as it has 600 lines (as opposed to 800 on the M4E)? I have got that impression from somewhere...

Any advice on what I should pick?
 

Peltz

Member
Fair enough. Which is the best for RGB gaming?
That is a very subjective question. It depends on how you like retro games to look.

Do you want them super sharp with thick black scan lines in a quality that is unlike anything you've seen before?

Something closer to arcade monitors? (which also contain a wide variety of different specs/looks)

Maybe something which doesn't make scanlines very visible at all?

Different monitors change pretty much everything.
 
I might be able to pick up either a PVM-20M2E or a PVM-20M4E. I understand that the M4E is the high-end model, but would I be right in assuming the M2E might be preferable, as it has 600 lines (as opposed to 800 on the M4E)? I have got that impression from somewhere...

Any advice on what I should pick?
I would pick based on condition. PVMs don't have an hour count, but there is a date of manufacture on the back, and if you have the opportunity to test them you can see which one has a better picture with fewer defects.
 

televator

Member
Yeah, that's how it works.

Per scanline (1d), per field (2d), per multiple fields (3d) comb filtering.


http://www.neogaf.com/forum/showpost.php?p=188331485&postcount=11840


I'm asking myself if interlacing could appear again in the future, given that LCD displays now try to mimic CRTs using BFI techniques and given that we're getting much higher refresh rates... [snip] ...there is a trend toward adapting all the consumer-facing backends to human perceptual characteristics, like these adapted RGBW displays... [snip]

I always wondered what held back a fixed-pixel display from simply behaving like a CRT and preserving a 15kHz signal. I mean, from my layman's point of view, I think that would be simpler and more power efficient than the processing methods that are currently employed to obtain a progressive image. Sure, fixed-pixel tech starts at 31kHz, but line multiplication can take care of that. From what I can understand of your post, I guess the issue was image persistence?

That RGBW pixel format looks interesting. I wanna do some reading on it.
 

Mega

Banned
Apart from eBay or Gumtree etc., does anyone know of sites you can search for PVMs in the UK?

I don't know of any specifics in your area, but the advice I always give is to look up video production places, recycling centers, liquidators, hospitals, universities, medical reseller sites, etc. Pick up the phone and call, or you may never hear back from anyone. Before my first PVM, I got return calls from three places. Also, sitting on eBay is not a bad option. Every now and then an incredible deal comes along if you're diligent with your search criteria.

I might be able to pick up either a PVM-20M2E or a PVM-20M4E. I understand that the M4E is the high-end model, but would I be right in assuming the M2E might be preferable, as it has 600 lines (as opposed to 800 on the M4E)? I have got that impression from somewhere...

Any advice on what I should pick?

Either one is fine and you should go with the one that has a screen in better condition. Assuming normal tube wear, the 20M4 won't be sharp enough to be glaring and dissuade anyone from getting it. Mine was a little soft but it never felt like wow I'm staring at huge distracting scanlines. "Too sharp" becomes a possible factor when you start looking at the BVMs with fresh tubes and dialed-in convergence, and even in that case they look great for 99% of games.

I tried asking this in the video comments, but does anyone have advice on how to find techs competent enough to restore CRTs? I have a consumer Trinitron that I had worked on by a guy who seemed to know what he was doing; he said he replaced some of the parts known to fail in my model, but the workers in that video seem to be on another level. Are those geometry-fixing magnets available to buy? Are they easy to use? Can you just stick them where the problem is and then go back to adjusting pots? As always, this stuff intimidates the hell out of me, and seeing such competent restoration amazes me.

A couple I sold one of my PVMs to does that art installation stuff. I may contact them and see if they know a repair person. I have read that in many cases the people who can fix our monitors are old and retired... or deceased. Then there's the problem of a lack of replacement parts for repairs. I recently drove past a TV repair shop on my way to work, a tiny place with a sign indicating it had been around since the '60s. I caught a glance of the guy inside and he looked like an old-timer. I wonder if he knows all about this stuff or if he was just a middleman who sent sets in for factory repairs.
 
^ Kinda makes me want to learn this stuff so the knowledge doesn't just get lost as generations disappear...
Yes! I have 2 consumer CRTs that I wanted to make into homemade arcade cabs, and I had the more valuable one professionally worked on, but the other one I might take some chances with in order to learn.
 
The PlayStation story is pretty well told at this point, i.e. the best ways to play each model's games, but what about Xbox? I was planning on eventually getting an OG Xbox and connecting it via component to my Framemeister, but is the 360 backwards compatible like the PS2 is with PS1 games? I'd prefer to minimize my consoles where I can; I use a PS2 for PS1 games and a Wii for GC games, and the very minor differences or quality loss are totally acceptable to me. But something like PS3 for PS1 games is not acceptable.
 

dubc35

Member
The PlayStation story is pretty well told at this point, i.e. the best ways to play each model's games, but what about Xbox? I was planning on eventually getting an OG Xbox and connecting it via component to my Framemeister, but is the 360 backwards compatible like the PS2 is with PS1 games? I'd prefer to minimize my consoles where I can; I use a PS2 for PS1 games and a Wii for GC games, and the very minor differences or quality loss are totally acceptable to me. But something like PS3 for PS1 games is not acceptable.
360s can play some, but not all, OG Xbox games.

I picked up an OG Xbox a couple weeks ago. I have it hooked up with an official MS component cable that also has digital optical audio out. The system looks and sounds great on my plasma and 5.1 surround system.
 

Peltz

Member
Probably a mixture of scan lines that are noticeable, but accurate picture too.

Middle of the road basically, lol.

I'd shoot for 600 lines or so. But definitely, condition is the most important factor and should trump all other considerations.

Anything higher than 600 lines will still look great, but it may almost look too good and not like things you've seen at the arcade.
 

televator

Member
The PlayStation story is pretty well told at this point, i.e. the best ways to play each model's games, but what about Xbox? I was planning on eventually getting an OG Xbox and connecting it via component to my Framemeister, but is the 360 backwards compatible like the PS2 is with PS1 games? I'd prefer to minimize my consoles where I can; I use a PS2 for PS1 games and a Wii for GC games, and the very minor differences or quality loss are totally acceptable to me. But something like PS3 for PS1 games is not acceptable.

Picture quality on the OG Xbox is nearly indistinguishable from modern consoles. Some Xbox games run on the 360, but you're lucky if they don't display graphical glitches and frame rate dips.
 

Peltz

Member
Picture quality on the OG Xbox is nearly indistinguishable from modern consoles. Some Xbox games run on the 360, but you're lucky if they don't display graphical glitches and frame rate dips.
I've seen people in this thread complain about PQ from the original Xbox. Not sure why, though.
 

televator

Member
I've seen people in this thread complain about PQ from the original Xbox. Not sure why, though.

I've heard that some have bad picture quality. Different revisions use different video chips. I've been trying to ascertain which models are considered better in this regard, but I can't find any comparisons on the web.
 

D.Lo

Member
I've seen people in this thread complain about PQ from the original Xbox. Not sure why, though.
There are like 200 revisions of the original Xbox, and many have a shit DAC. I've read that possibly the original batch had a good DAC. Maybe North America got more of the better systems too? I have seen two (both hacked PAL systems) run side by side, and one looked much better than the other with the same connections.

Also, the PAL console is locked to blurry-ass 576i at the OS level, unlike the Gamecube, which can do 480p fine as long as you're playing Japanese/US software. I'm still trying to hack one I got recently since my original died, and all the tools require ancient connections and/or tiny USB sticks, etc.

In my experience even at its best it's not anywhere near as good as the superb Gamecube component output, but then that's a pretty high bar.
 

televator

Member
This article took a lot more brain power for me than it should have for some reason.

Apparently, in interlaced video, the second chroma value in the YUV sampling standard applies to odd lines... in one field. That really threw me for a loop. The sampling happens at the field level, not across the entire frame. It's so much easier to grasp in progressive signals because of how straightforward it is: each picture is a frame in its entirety, so determining chroma and luma resolution is much easier.
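If it helps to see the bookkeeping, here's a rough sketch of the idea (my own illustration of generic 4:2:2 / 4:2:0 counting; the exact chroma siting rules for interlaced 4:2:0 are fussier than this):

```python
def chroma_samples(width, height, scheme="4:2:2", interlaced=False):
    """Rough luma/chroma sample counts per picture.

    The point for interlaced material: subsampling is applied per field, so in
    4:2:0 the two lines that share a chroma sample belong to the same field,
    i.e. they are two frame lines apart.
    """
    luma = width * height
    if scheme == "4:2:2":
        chroma_per_plane = (width // 2) * height            # horizontal halving only
    elif scheme == "4:2:0":
        lines = height // 2 if interlaced else height       # work per field if interlaced
        fields = 2 if interlaced else 1
        chroma_per_plane = fields * (width // 2) * (lines // 2)
    else:                                                    # 4:4:4
        chroma_per_plane = luma
    return luma, chroma_per_plane

print(chroma_samples(720, 480, "4:2:0"))                   # (345600, 86400)
print(chroma_samples(720, 480, "4:2:0", interlaced=True))  # same count, different line pairing
```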
 

televator

Member
I've seen people in this thread complain about PQ from the original Xbox. Not sure why, though.

Coming back to this conversation a bit. It appears some of the issues with Xbox video stem from the use of a filter in 480i that results in a soft image on a fixed-pixel display. This is completely circumvented by simply not using 480i: go to 31kHz and above and the filter is gone. The filter was there in the first place to reduce flickering.

I was lucky enough to find a seller last year who sold me an Xbox bundled with 5 games and a monster component cable for 25 bucks. That was my first ever Xbox and my experience has been nothing but crisp clean video.

Edit: Further reading reveals that the two early DACs used in the Xbox tap the composite signal(?) for SCART output.

Edit 2: My mistake. For RGB SCART, Xboxes with those DACs use sync-on-composite, and the bandwidth is reduced to the same level as composite video through the use of a low-pass filter. The technicalities of this are beyond my current understanding, so I can't vouch for its validity.
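On that 480i flicker filter: the usual approach is a vertical low-pass, blending each line with its neighbours so one-line-high detail doesn't strobe at the field rate. A generic sketch of that kind of filter (not the Xbox encoder's actual tap weights, which I don't know):

```python
import numpy as np

def flicker_filter(frame, weights=(0.25, 0.5, 0.25)):
    """Generic vertical low-pass 'flicker filter' across frame lines.

    Softens one-line-high detail so it doesn't flicker at the field rate on an
    interlaced display; the same softness is what you notice on a fixed-pixel
    screen. Edge lines simply wrap around here, purely for brevity.
    """
    above = np.roll(frame, 1, axis=0).astype(np.float32)
    below = np.roll(frame, -1, axis=0).astype(np.float32)
    out = weights[0] * above + weights[1] * frame.astype(np.float32) + weights[2] * below
    return np.clip(out, 0, 255).astype(np.uint8)
```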
 

Timu

Member
Coming back to this conversation a bit. It appears some of the issues with Xbox video stem from the use of a filter in 480i that results in a soft image on a fixed-pixel display. This is completely circumvented by simply not using 480i: go to 31kHz and above and the filter is gone. The filter was there in the first place to reduce flickering.

I was lucky enough to find a seller last year who sold me an Xbox bundled with 5 games and a monster component cable for 25 bucks. That was my first ever Xbox and my experience has been nothing but crisp clean video.
Thank god most Xbox games are 480p; hell, 50+ of my games are. Agent Under Fire is sadly 480i, though, which is weird on that system.
 

D.Lo

Member
Coming back to this conversation a bit. It appears some of the issues with Xbox video stem from the use of a filter in 480i that results in a soft image on a fixed-pixel display. This is completely circumvented by simply not using 480i: go to 31kHz and above and the filter is gone. The filter was there in the first place to reduce flickering.
Actually, that makes a lot of sense of my early experiences.

I had a top-quality 480i/576i Panasonic, and the Xbox just looked blurry on it, N64-style. Especially for 2D: I was excited to play Street Fighter III online, but went straight back to the Dreamcast because it looked like dogshit (also, Australian broadband wasn't good enough; it was a lag nightmare). All the while the Gamecube (480i and 576i via component) looked fantastic, especially properly converted PAL games with the actual extra resolution reducing dithering (Metroid Prime in 576i on a CRT: knockout). Even the PS2 looked better in some games. Even once hacked, my only options were 480i and 576i.

There were almost no 480p CRT screens here, as PAL broadcasts never had the flickering issues NTSC did, which 480p was a fix for.

Still, years later I have seen two different Xbox consoles at 480p look different too, so the DAC issue is true as well.
 

televator

Member
Thank god most Xbox games are 480p; hell, 50+ of my games are. Agent Under Fire is sadly 480i, though, which is weird on that system.

That's a shame for that game; I was looking at picking it up. I think it's one of the games that can come into play along the path of softmodding.

Another side note from my investigation so far: some people have no fucking clue what they're talking about, lol. I read a post complaining about how the Xbox compresses video data to 4:2:2... No fucking shit. Welcome to component video and YUV standards from like 60 years ago.
 

Timu

Member
That's a shame for that game; I was looking at picking it up. I think it's one of the games that can come into play along the path of softmodding.

Another side note from my investigation so far: some people have no fucking clue what they're talking about, lol. I read a post complaining about how the Xbox compresses video data to 4:2:2... No fucking shit. Welcome to component video and YUV standards from like 60 years ago.
Yep.

[screenshot: armarec20160223_004813.png]


Had to use my upscaler to get 60FPS and no interlacing in motion. Also this game looks blurrier and more washed out than my 480p games.
 

televator

Member
Still, years later I have seen two different Xbox consoles at 480p look different too, so the DAC issue is true as well.

Was this comparison done under controlled settings that note the DACs in each unit? I have yet to find a straight picture comparison of Conexant vs. Focus vs. Xcalibur.

Yep.

[screenshot: armarec20160223_004813.png]


Had to use my upscaler to get 60FPS and no interlacing in motion. Also this game looks blurrier and more washed out than my 480p games.

Jesus Cristo! That is probably the worst combing I have ever seen. It's straight up double vision. lol
 

D.Lo

Member
Was this comparison done under controlled settings that note the DACs in each unit? I have yet to find a straight picture comparison of Conexant Vs Focus Vs Xcalibur.
No, just casually. I plugged both (softmodded for 480p) machines directly into my Panasonic plasma with the same cables. Mine looked better than my friend's. Unfortunately that console died, though; I now have another that's stuck at 480i/576i, and I'm looking for compatible USB drives to act as a fake memory card for a softmod. All the tools to do so are now ancient.
 

televator

Member
No, just casually. I plugged both (softmodded for 480p) machines directly into my Panasonic plasma with the same cables. Mine looked better than my friend's. Unfortunately that console died, though; I now have another that's stuck at 480i/576i, and I'm looking for compatible USB drives to act as a fake memory card for a softmod. All the tools to do so are now ancient.

Is there a size limit on usable USB drives? If you can figure out whether the DACs are different and there's a quality difference in progressive video, I'd sure like to see the results. I managed to pick up a second free Xbox (it had a faulty but easily fixable disc drive), but unfortunately it's the same DAC as the other (Conexant).

Hilariously enough it's not deinterlaced at all in this pic!

It's not motion-compensated via edge detection. Both fields are there, so it's rather sloppily deinterlaced to progressive. I imagine the camera panned very quickly when you took that shot, hence the disparity in temporal resolution between the fields.
 