
Upscalers, CRTs, PVMs & RGB: Retro gaming done right!

Status
Not open for further replies.

Mega

Banned
haven't kept up since plasmas kinda died out - are we headed into that, lagless input/CRT filters and all? is there any indication so far?

There are monitors now with half a frame of lag, image processing advances may improve on that and it's getting to a point where gamers don't notice lag with these monitors. Full array LED and OLED match/exceed the contrast and black levels of CRT. Pixel transition improvement has long eliminated traditional blur/ghosting. Low persistence/strobing eliminates motion blur caused by the combination of our constant eye tracking + sample-and-hold displays. And I keep reading that higher resolutions = better emulation filters.

It'll all come together. I hope.
 

IrishNinja

Member
There are monitors now with half a frame of lag, image processing advances may improve on that and it's getting to a point where gamers don't notice lag with these monitors. Full array LED and OLED match/exceed the contrast and black levels of CRT. Pixel transition improvement has long eliminated traditional blur/ghosting. Low persistence/strobing eliminates motion blur caused by the combination of our constant eye tracking + sample-and-hold displays. And I keep reading that higher resolutions = better emulation filters.

It'll all come together. I hope.

man, me too - whenever it's this plasma's time to go, i'm hoping for a more ideal solution from a big screen HDTV, and all of that in one set would be amazing
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Competent computer monitors have had ~3ms input lag or lower for years now. If that's your major concern then it's not hard to find a satisfactory display these days.

In theory, it's actually feasible for flat-screens to beat out CRTs for input lag in the future, but only if the refresh rate is extremely high while also keeping the processing delay to a minimum. I don't expect TV manufacturers to ever make it a priority, though.

Maybe they'll release one that can scale up to 4K, for all those fools adopting tech too early.

XRGB-4K has a nice ring to it.

Not much point in that, since any 1080p signal should scale up to 2160p pretty cleanly on any 4K set.

They "should," but I don't think it's wise to assume that TV manufacturers will implement competent scaling engines for video games, no matter how simple the feat may be.

Considering the shelf life of their previous scalers (the XRGB-3 is still available for purchase brand new), Micomsoft would be foolish to not target 4K resolution. The whole point of their devices is to match old video sources to modern displays, so they might as well aim for an actually modern resolution.
 

Mega

Banned
Competent computer monitors have had ~3ms input lag or lower for years now. If that's your major concern then it's not hard to find a satisfactory display these days.

In theory, it's actually feasible for flat-screens to beat out CRTs for input lag in the future, but only if the refresh rate is extremely high while also keeping the processing delay to a minimum. I don't expect TV manufacturers to ever make it a priority, though.

Oh, that's true. I was only thinking of HDTV input lag numbers and forgot monitors have even better.

How fast of a refresh rate? I think we're now seeing monitors with 144hz refresh. I'm not betting on TV manufacturers bringing us what we need in consumer sets, same as BVM/PVM beat the pants off any old CRT TV. It'll be gaming monitors, from the likes of Asus, that nail every spec.
 
I was a little disappointed that the FM only did 1080p, and not the 1920x1200 that the 3 did over DVI.

I mean, 240p divides evenly into 1200 - that makes wayyyyy more sense for scaling than crummy 1080.
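The arithmetic behind that (my own quick illustration): 240 source lines fit into 1200 output lines a whole number of times, but not into 1080.

```python
# 240p -> 1200p is a clean integer scale; 240p -> 1080p is not.
source_lines = 240

print(1200 / source_lines)  # 5.0 -> exact 5x, every source line becomes 5 lines
print(1080 / source_lines)  # 4.5 -> fractional, so lines come out uneven heights
```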
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
How fast of a refresh rate? I think we're now seeing monitors with 144hz refresh. I'm not betting on TV manufacturers bringing us what we need in consumer sets, same as BVM/PVM beat the pants off any old CRT TV. It'll be gaming monitors, from the likes of Asus, that nail every spec.

The theoretical minimum display lag (assuming that the display in question has no processing lag at all) is halved every time the refresh rate is doubled. At 60 Hz, each frame takes ~17ms to finish drawing, while at 120 Hz, each frame takes ~8ms. You can see it in action in this comparison:

lag_dell_e773c_vga_1080p_hdfury2_xcapture1.png
lag_sony_kdl65w950b_hdmi_1080p_backlight10.png


Both displays start drawing their picture at the top line and then fill in each successive line in sequence, ending at the bottom, after which they go back to the top to start drawing the next frame. The three white bars on the left of the picture are used to measure the display lag (defined as the time elapsed between the moment when the current frame was sent from the tester's HDMI-out to the moment at which the tester's photosensor detects the white bar on the screen) at each of those 3 points of the screen: top, middle, and bottom.

The CRT on the left is running at 60 Hz. The LCD on the right is running at 120 Hz. You can see that the CRT "starts off" significantly faster than the LCD (1ms vs 13ms), which suggests that the LCD is a fair bit slower at processing the image. But take note of the differences between the top sensor and the bottom sensor: 14ms for the CRT vs 7ms for the LCD. This suggests that the LCD "catches up" a little bit by the time it's done drawing its current frame because its refresh rate is twice as fast. Increase the refresh rate by a few more factors and the display could basically finish drawing the picture instantaneously, beating out the CRT.

That's assuming that the display always draws the picture at its refresh rate, ignoring the actual frame rate of the video source. I'm not 100% sure that's always the case, but it looks like it's the case in the example above. (The lag tester being used is a 60fps video source, but the LCD still seems to be drawing at 120 Hz.)

As an aside, this testing method is also why a lot of recent HDTV lag measurements tend to indicate that displays have more lag than previously reported by old testing methods like the camera + CRT method. The camera method doesn't take the refresh rate into account and treats the CRT's inherent lag as "zero."
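The scan-out arithmetic above can be sketched as a toy model (my own illustration, not the tester's actual method): lag at a given point on the screen is roughly the fixed processing delay plus that point's fraction of one refresh period.

```python
def display_lag_ms(refresh_hz, processing_ms, screen_fraction):
    """Toy raster scan-out model: a point a given fraction of the way
    down the screen is drawn that fraction of a refresh period after
    scan-out starts, on top of any fixed processing delay."""
    refresh_period_ms = 1000.0 / refresh_hz
    return processing_ms + screen_fraction * refresh_period_ms

# At 60 Hz the bottom of the screen trails the top by a full ~16.7 ms;
# at 120 Hz that gap shrinks to ~8.3 ms, which is why a fast-processing
# high-refresh panel could eventually finish a frame before a 60 Hz CRT does.
```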
 

Khaz

Member
In theory, it's actually feasible for flat-screens to beat out CRTs for input lag in the future, but only if the refresh rate is extremely high while also keeping the processing delay to a minimum. I don't expect TV manufacturers to ever make it a priority, though.

lighting the display as the data is being sent is pretty much as fast as a display can be though, you can't beat the arrow of time.

Flat-panels have an inherent input lag because of how they operate. They need a full frame in their buffer before changing all the pixels at the same time. It can be almost as good, but nothing beats printing scan line after scan line, pixel after pixel as soon as it's received without buffer. An ideal flat-panel would print the whole image as soon as it receives the last bottom pixel and thus have a negligible input lag, considering how the console and the user work (reacting to a pixel so that the pixel in the following frame is changed). It would still allow for a single frame of lag in very specific situations though.

If they were to change the technology to have an asynchronous pixel switch, it would allow real-time, buffer-less picture printing. With a higher native res, the display could be designed to light a bunch of LEDs per bit sent and allow for raw upscaling. I doubt this sort of technology will ever come to us consumers, if it's even invented. I'm not familiar with how HDMI works, so this sort of technology could even be pointless in the digital age, only useful for us analogue weirdos with light guns.
 

Galdelico

Member
As long as the converter handles 240p correctly, those visual effects should be preserved.

If they don't look right, that's usually a tell-tale sign that the converter/display is incorrectly treating 240p as if it was 480i. A lot of old games have visual effects that alternate on and off (or "flicker," as you said) for several frames in a row at 60fps. If the device mistakes 240p for 480i, then neighboring frames either get blended together or dropped entirely and it just looks wrong.
Thanks a lot!
So is it due to screen resolution and not refresh rate? I was under this impression, because those effects get displayed correctly on my CRT TV, no matter if Street Fighter Zero 2 - example related to those videos I posted above - runs on the SEGA Saturn, displayed at a native 240p resolution, or upscaled to 480i on the PlayStation 2 (from the Street Fighter Zero collection). On the other hand, both versions look messed up - just like in the YouTube vids - on my LCD TVs.
 

Khaz

Member
The theoretical minimum display lag (assuming that the display in question has no processing lag at all) is halved every time the refresh rate is doubled. At 60 Hz, each frame takes ~17ms to finish drawing, while at 120 Hz, each frame takes ~8ms. You can see it in action in this comparison:

If I understand correctly, faster-than-frame displays refresh half of the frame every time instead of the full frame, while making the entire screen blink. Something like:

time:            0ms            8ms              16ms             24ms             32ms
analogue input:  frame 1, line 1 .......... frame 1, line 240 | frame 2, line 1 .......... frame 2, line 240 | frame 3 ...
120Hz frame:     nothing        top 1, bottom 0  top 1, bottom 1  top 2, bottom 1  top 2, bottom 2
60Hz frame:      nothing        ..............   frame 1          ..............   frame 2

(Hopefully the table won't be messed up, sorry, phone users)

In this scenario, in order to get a scanline-fast refresh speed, you need a refresh of 15kHz, or 11MHz to go pixel-fast. The fastest current models are 144Hz. I really don't think the current technology will go there; you need a change of paradigm.
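The rough arithmetic behind those figures, assuming NTSC-ish 240p timing (~262 total lines per frame including blanking, and ~720 dots per line as a ballpark):

```python
lines_per_frame = 262    # NTSC-ish total line count, including blanking
frames_per_second = 60
dots_per_line = 720      # ballpark horizontal sample count

line_rate_hz = lines_per_frame * frames_per_second  # "scanline-fast" refresh
dot_rate_hz = line_rate_hz * dots_per_line          # "pixel-fast" refresh

print(line_rate_hz)  # 15720     -> ~15.7 kHz
print(dot_rate_hz)   # 11318400  -> ~11.3 MHz
```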
 

Khaz

Member
The previous picture I posted shows an LCD that doesn't change all the pixels at the same time.

In my understanding, they do. Everything is updated twice as fast with 120Hz displays: the buffer updates half-frames, and the LEDs blink twice as fast, except half of the time it's to display the same thing.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
In my understanding, they do. Everything is updated twice as fast with 120Hz displays: the buffer updates half-frames, and the LEDs blink twice as fast, except half of the time it's to display the same thing.

Your understanding is incorrect. If the HDTV in the above picture always updated the entire picture at once, then all 3 bars would be showing identical values. (Read what I posted. If you understand how the tester works then this should make sense.)

This is the case for some flat-screen displays. The documentation that came with the tester said that it is common for plasma TVs to push every pixel to the screen at once (resulting in identical readings on all 3 bars), but LCDs tend to mimic CRT raster scan behavior and draw in everything line-by-line.

You might be correct in saying that an LCD has to receive an entire frame before it can start drawing the picture... but does it take 16.66ms to send a full frame? What if the video output of the game console (or other source) is something like this:

timing.png


(I'm not saying this is necessarily the case; this is just my assumption, because I can't think of anything else that would make sense after accounting for the display behavior that I do know. Also: woo MSPaint!)

Anyway, if you think of the timing for the video source in terms of that graph, then you could see why it would only take an LCD 2 or 3 milliseconds or so to start drawing a picture. Then this test (on a 60 Hz LCD monitor) would make sense:

lag_asus_vh236h_hdmi_1080p.png


It would be possible that it takes about 2ms to 3ms to fill the LCD's frame buffer, and then the LCD starts drawing the picture almost immediately, ending about 14ms later (3ms for the top bar; 17ms for the bottom bar). A 120 Hz LCD with the same buffer and processing speed could finish drawing the picture in 7ms, giving 3ms for the top bar but 10ms for the bottom bar.

In theory. I'm honestly not confident in how a frame buffer works in this context, but it makes sense to me that a complete frame can be sent from a video source at a speed faster than the video's frame rate.

I'd actually love to learn more about this. The nature of how the timing works between source and display is something I really want to read more about. I once heard Kevtris say that the NESHDMI has an "8 scanline buffer" that results in near-zero lag, implying that HDMI displays don't have to wait on a full frame before they start drawing, but I'm not sure if I understood that correctly.

Thanks a lot!
So is it due to screen resolution and not refresh rate? I was under this impression, because those effects get displayed correctly on my CRT TV, no matter if Street Fighter Zero 2 - example related to those videos I posted above - runs on the SEGA Saturn, displayed at a native 240p resolution, or upscaled to 480i on the PlayStation 2 (from the Street Fighter Zero collection). On the other hand, both versions look messed up - just like in the YouTube vids - on my LCD TVs.

Your TV probably has a bad deinterlacing algorithm and also probably treats 240p as if it was interlaced.
 

Khaz

Member
You might be correct in saying that an LCD has to receive an entire frame before it can start drawing the picture... but who says that the wait time has to be 16.67 milliseconds? What if the video output of the game console (or other source) is something like this:

timing.png

This I'm confident isn't what happens with analogue signals and consoles. Each pixel of a line is computed and sent in real time by the VDP, to be displayed just as immediately by a buffer-less CRT. The more capable the VDP, the higher the horizontal resolution and the thinner the objects it can draw on screen. Each pixel sent is converted instantly into an electron burst from the never-stopping, always-scanning beam. In fact, an input can be taken mid-frame to modify only the lower, subsequent part of the image. It's most easily understood by reading about how early consoles like the Atari 2600 draw their frame, or "chasing the beam", but all the analogue consoles behaved the same way (with additional, asynchronous off-screen and on-screen computing depending on the generation).

With digital video output and computers I'm less confident, but I think it's more like what you drew. I know the frame can be calculated very fast, leading to crazy benchmark numbers like the video card computing 200 frames per second but only actually sending 60 of them due to the limitations of the display. In that case, an increase in display frame rate, plus the adaptive refresh technology that's all the rage in PC gaming, is highly beneficial. But this is a different world from our old consoles.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
With digital video output and computers I'm less confident, but I think it's more like what you drew. I know the frame can be calculated very fast, leading to crazy benchmark numbers like the video card computing 200 frames per second but only actually sending 60 of them due to the limitations of the display. In that case, an increase in display frame rate, plus the adaptive refresh technology that's all the rage in PC gaming, is highly beneficial. But this is a different world from our old consoles.

Well I'm talking mostly in terms of digital video and HDMI (or other digital formats). In this realm, I think it's feasible that my earlier post--that LCDs could eventually outperform CRTs in display lag--is true... for digital video sources.

When it comes to converting analog sources to digital displays, I'm not so sure.

EDIT: Actually... I should just do a camera test of my own. I could split the 240p Test Suite's input lag timer to a CRT and my LCDs. If the difference is less than 16ms then there's no reason to believe than an LCD has to wait for more than a frame before it does anything.
 

Khaz

Member
Yeah. I'll just disagree with the "outperform", as it would be difficult to beat a speeding, unbuffered electron. But made equal in an all-digital world? Probably, once we hit an 11MHz display rate. And at that point, if using computer simulations of our old consoles, we could have a flawless, identical, all-digital recreation of our old media. But I suspect that will be a long, looong time from now.
 

D.Lo

Member
The CRT is an absolutely crazy insane ahead of its time technology.

I mean really, capturing video at a set resolution via an analogue method, turning it into an electrical signal, then reconfiguring it by firing electron beams onto a phosphorescent screen?

And this shit was done in 1907?

And a commercially released TV came out in 1934?

What. The. Fuck.

Personally I think it's got to be one of the most amazing things humans have ever achieved for the time it was done. Mind boggling that it is over 100 years old.
 

missile

Member
The CRT is an absolutely crazy insane ahead of its time technology. ...
That's for sure. It's really insane what goes on in a tube when actually displaying a picture on the screen. The CRT literally combines all of the 19th and 20th century of science in a bulb of glass. Dealing with CRTs is playing with nature right there. I think people were stunned seeing an all-electronic video transmission from one tube to another (realized by Manfred von Ardenne) at the Deutsche Funkausstellung in Berlin in 1930, where Albert Einstein held the opening speech.

That's also why I think that one day a tube will be built from scratch again with contemporary technology. It might just be a university project or whatever, but it will be done, simply because a CRT is a masterpiece of science, an icon combining the roots of all of electronics in one piece.

I love digital, but analog is seeing nature at its best.
 
also, i'm worried that they'll announce a new upscaler that's gonna kill the resale value - it's been a while, and it seems likely a new one could drop any time now.
The Mini didn't kill the resale value of the XRGB3, people were still selling them second hand for $300+ for years after it came out.
What's weird about the Mini's input lag is that it's the same for all incoming resolutions and in all modes. Shouldn't deinterlacing add at least one more frame of input lag compared to multiplying the lines of progressive sources?
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
What's weird about the Mini's input lag is that it's the same for all incoming resolutions and in all modes. Shouldn't deinterlacing add at least one more frame of input lag compared to multiplying the lines of progressive sources?

Deinterlacing only takes extra time when the display tries to interpolate or guess what the "missing" lines of the picture are "supposed" to look like. This would require a comparison to the previous and following frames in order to be done well and would take extra time to do, yes.

The XRGB-1, XRGB-2, and XRGB-3 (B1 mode) don't even try to interpolate those missing lines, even in genuine 480i video, where that method would be appropriate. When they receive an odd field, they fill in the even lines with a copy of the odd lines. When they receive an even field, they fill in the odd lines with a copy of the even lines. Hence the term "line-doubling." This produces a slight wobbling or shaking effect when you give them 480i video (which they correct a little bit by shifting the odd frames slightly downward and/or the even frames slightly upward).

It is weird, though, that the Framemeister's handling of progressive resolutions seems to be no faster than its genuine deinterlacing method (actual interpolation).
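The line-doubling described above can be sketched like this (a toy illustration of the idea, not Micomsoft's actual implementation):

```python
def line_double(field_lines, field_is_odd):
    """Fill a full-height frame from one field by copying each field
    line into its missing neighbor instead of interpolating. The
    odd/even flag decides which slots the real lines occupy, which is
    what causes the slight frame-to-frame vertical wobble on 480i."""
    frame = [None] * (2 * len(field_lines))
    for i, line in enumerate(field_lines):
        real_slot = 2 * i + (1 if field_is_odd else 0)
        copy_slot = 2 * i + (0 if field_is_odd else 1)
        frame[real_slot] = line
        frame[copy_slot] = line  # the "missing" line is a plain copy
    return frame
```

Either field produces the same doubled content; only which slot holds the "real" line differs, hence the half-line bob between successive fields.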
 

Galdelico

Member
Your TV probably has a bad deinterlacing algorithm and also probably treats 240p as if it was interlaced.
I see, thank you for the clarification.
Not a big deal, since I'm not gonna use the LCDs (a few years old Panasonic Viera TVs) for retrogaming anyway, but good to know why they both actually suck. :D
 

Khaz

Member
When they receive an odd field, they fill in the even lines with a copy of the odd lines. When they receive an even field, they fill in the odd lines with a copy of the even lines. Hence the term "line-doubling."

I wonder what would the picture look like if the empty line wasn't filled with a copy of the above line but instead kept blank, or black. I suppose the picture would be dimmer, but wouldn't it give a better look overall?
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
I wonder what would the picture look like if the empty line wasn't filled with a copy of the above line but instead kept blank, or black. I suppose the picture would be dimmer, but wouldn't it give a better look overall?

Maybe, but the trembling isn't really that noticeable as it is unless you're up close to the screen or looking at static text/HUDs. It's not really any worse than watching interlaced video on actually interlaced screens, imo. I also think you'd effectively cut the brightness in half by doing this; the strobing option on modern Sony Bravias (Impulse Motionflow) does something like this and it's basically unwatchable in any setting other than a pitch black room.

But I kind of wonder what it would look like if, instead of blanking out the previous field, the scaler actually kept the previous field in memory and drew it to the screen twice in a row, actually mimicking real interlaced video. (When I was making the videos for the OP, I wanted to find a filter that did this and use it on the 480i captures because I think that would have been the most authentic way to present those videos, but I never came across one, so I just used a Yadif filter instead.)
 
But I kind of wonder what it would look like if, instead of blanking out the previous field, the scaler actually kept the previous field in memory and drew it to the screen twice in a row, actually mimicking real interlaced video.
Yes why not this? It seems, from my lay perspective, to be less work for the processor to repeat draw each odd or even frame rather than copying and shifting every frame, and that the result would be better. The way you say it works, wouldn't you be able to notice everything moving up and down relative to where it would be in a constant, progressive display, by the height of a line?
 

Peagles

Member
Does anyone own a Toro for the Dreamcast? How long did it take to get to you once it shipped?

I ordered one back in October but it still hasn't arrived. I've been emailing them for weeks about it. They said they have to wait another week before they can make a claim or something. So much for thinking I had ordered early enough for Christmas.
 

Mega

Banned
I got a great deal today on a 17" JVC HD CRT. I'm excited to see Wii, GC and PS2 games in 480p component!

Funny coincidence with the current topic in this thread... what do I need for the Dreamcast to best take advantage of this HD CRT? A Kuro box? Not sure if I need the extra options of the Toro but maybe I'm overlooking something.
 
I ordered one last month and it arrived like two weeks ago. <_<

Actually not sure it was the right purchase to make, since all I really need is 480i SCART, but can't hurt to have a VGA Box handy if I decide to make any changes in the future.
 

Huggers

Member
Finally managed to find a SCART amplifier/splitter. It shot up in price at the end of the auction, but they're so hard to find these days that I'm just glad I managed to nab it.
 

HTupolev

Member
I wonder what would the picture look like if the empty line wasn't filled with a copy of the above line but instead kept blank, or black. I suppose the picture would be dimmer, but wouldn't it give a better look overall?
You mean just draw "scanlined" even/odd frames?

Seems likely that it would crank all the interlacing artifacts to 11, but it would be interesting to see.

But I kind of wonder what it would look like if, instead of blanking out the previous field, the scaler actually kept the previous field in memory and drew it to the screen twice in a row, actually mimicking real interlaced video.
Yes why not this? It seems, from my lay perspective, to be less work for the processor to repeat draw each odd or even frame rather than copying and shifting every frame, and that the result would be better. The way you say it works, wouldn't you be able to notice everything moving up and down relative to where it would be in a constant, progressive display, by the height of a line?
That's basically what "weave" deinterlacing does in VLC. It looks horrible, the combing artifacts are ridiculous.
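A weave pass like that can be sketched as follows (my own toy version, not VLC's actual code): interleave the two most recent fields into one progressive frame, and wherever the fields come from different moments in time, the mismatched lines show up as combing.

```python
def weave(odd_field, even_field):
    """Weave deinterlacing: interleave the most recent even and odd
    fields into one progressive frame. With motion between the two
    fields, adjacent lines disagree -> combing artifacts."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)  # even lines: 0, 2, 4, ...
        frame.append(odd_line)   # odd lines:  1, 3, 5, ...
    return frame
```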
 
Is there any good option out there at all currently for a 6-8 scart switch? I just jumped on board and ordered a framemeister. And of course retrogamecave sells out of the sega trio power supply as soon as I want to get one.
 

Teknoman

Member
About those scart to yuv switches... is it possible they'd wear down over time with constant switching of scart plugs? Nothing like playing a few minutes and then jumping to another system, but just wondering about overall wear and tear.

Being extra careful with this one, even though i'm not sure what causes visual issues in the first one I had years ago.
 
About those scart to yuv switches... is it possible they'd wear down over time with constant switching of scart plugs?
I don't personally know but have been told in this thread over and over that SCART connections are the weakest of connections and the soldered pins can get pulled out relatively easily. If you could have your systems that you used often going into a SCART switch first and then the SCART out into the transcoder that would drastically lower your chance of problems, right?
 

Teknoman

Member
I don't personally know but have been told in this thread over and over that SCART connections are the weakest of connections and the soldered pins can get pulled out relatively easily. If you could have your systems that you used often going into a SCART switch first and then the SCART out into the transcoder that would drastically lower your chance of problems, right?

If it didn't degrade the signal, maybe? That actually doesn't sound like such a bad idea...
 
If it didn't degrade the signal, maybe? That actually doesn't sound like such a bad idea...
This thread covers which switches don't degrade the signal. Bandridge is good (4 inputs, not being made anymore), the new one from AssemblerGames is great (8 inputs, waiting list), and the Hama and its clones have mixed reviews (3 inputs, generally available on eBay).
I have that transcoder too and am planning on using it for a kind of arcade cabinet setup with systems connected through a SCART switch to the transcoder and into my Sony KV32 FV310. I haven't gotten a SCART switch yet because of the availability problems.
 

Adam Blue

Member
I don't know if this has been mentioned, but retro gaming cables UK might be making a switch to sell. They sent out a survey asking what customers would want from a switch.
 

HTupolev

Member
Pretty sure that "weave" means that there's no deinterlacing at all. That's how it works in most programs, anyway.
There's no such thing as "no deinterlacing" if you're converting data from an interlaced signal onto a progressive signal. Sticking the most recent even field into the even lines of a buffer, and the most recent odd field into the odd lines of a buffer, and displaying that buffer progressively, is still a form of deinterlacing.

(And, correct me if I'm wrong, but it seems to be exactly what you described.)
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
There's no such thing as "no deinterlacing" if you're converting data from an interlaced signal onto a progressive signal. Sticking the most recent even field into the even lines of a buffer, and the most recent odd field into the odd lines of a buffer, and displaying that buffer progressively, is still a form of deinterlacing.

(And, correct me if I'm wrong, but it seems to be exactly what you described.)

What I'm looking for is something like this.

This is what you're talking about.
 

Rongolian

Banned
Well, I've been really trying to narrow down how to make my XRGB Mini's scanlines look as BVM-like as possible. I truly think, for retro gaming, the BVM gets scanlines just right. I used photos like the one taken by Fudoh below for reference:

scan_bvm.jpg


Using FBX's SNES 4x integer scaling profile, input RGB via SCART, output at 1080p, without scanlines, the output looks like so:




After using the recommended settings for scanlines, INT_LINE: 80 and INT_SMOOTH: 100, the output gives pretty washed out scanlines:



After tweaking the settings myself (INT_LINE: 0, INT_SMOOTH: 90, with Gamma boosted to 30 to compensate for the darkening of the image), I get these results:




Keep in mind these were all taken with a cell phone camera, but you can see that you can get very close to BVM-like scanlines by modifying these two scanline parameters.

I'll probably keep tweaking them, but for now I'm pretty happy with these results. Feel free to try these settings, and let me know if you have any additional tweaks!
 

missile

Member
My brand-new toy. A mini (oscilloscope) CRT.

BEswaZ3.jpg


u1vMgeb.jpg


D4kTmkC.jpg


Want to experiment with this, i.e. producing a raster and stuff.

Well, I'm searching for mini RGB CRTs < 4" (the smaller, the better). Does anyone know where to get one of those, or in which devices such CRTs were installed? I was looking at viewfinder (video camera) CRTs, but they are all b/w; I haven't found a color one yet. Does anyone know of a video camera where the viewfinder was in color? Other devices?
 

Morfeo

The Chuck Norris of Peace
Finally got all the SCART cables I need for my systems. Now I just need that 8 SCART switch to come out so I can plug them all in.

This has taken forever, wonder if it will even be commercially available by the end of next year? Sure hope so!
 