AV/HT Experts: Video Switching - use receiver or TV?

Status
Not open for further replies.
I bought a home theatre system last weekend (an Onkyo TX-SR503 receiver with a Polk Audio Monitor 104 5.1 bookshelf speaker set) and I'm wondering what's better for video switching: the receiver or the TV. Currently I'm using the TV (a Panasonic PT-43LC14) for video switching since it has tons of inputs (4 component, 3 composite/S-Video, 1 RGB, 1 HDMI, standard coax, etc.). The Onkyo is being used only for pure audio: 3 digital audio feeds (Xbox, PS2/DVD, DirecTivo) and regular stereo audio feeds from my TV and iPod.

Is there any advantage to moving all of my video switching to the receiver instead of the TV? The receiver doesn't have 'high-end' video features such as S-Video-to-component upconversion or HDMI, but it still offers a lot of video connection options.

Overall the system sounds really great for what I want and is (kinda =P) within my budget, though I'm currently running only 3.1 sound (I'm figuring out how I'm gonna wire my living room for the surround speakers). Hopefully the system will sound even better after the speakers have broken in =)
 
If the receiver has enough video bandwidth and low signal interference on its video switcher, I would use the receiver to switch (just easier and more options).

If the receiver is low bandwidth (under 100 MHz) or you notice interference coming through on component, then go with the TV.
 
borghe, I just checked the specs on the Onkyo receiver: it has a 50 MHz bandwidth component switcher (I can't find those specs for my TV). How's that?

Can anyone school me on how bandwidth rating applies to video component switching? Thanks!
 
50 MHz is fine for typical HD over component. I believe I read somewhere that 1080i over component only consumes around 40 MHz. The problem is if they ever release HD standards over component that are higher res than 1080i (1080p, for example).

The MHz rating is the range of signal frequencies the switcher can pass. A 1080i signal (when separated into its components) takes about 40 MHz worth of frequency range to carry in full detail (40 MHz video bandwidth). A 720p signal takes less bandwidth (less information); a 1080p signal takes more bandwidth (more information). 50 MHz might be bad if you are using VGA breakout cables to feed a very high-res monitor (1600x1200, for example) or, as I said, 1080p (which in theory would require about 80 MHz of bandwidth).
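As a rough sanity check on those numbers, here's a small sketch. It assumes the usable analog bandwidth needed is about half the format's standard pixel clock (a Nyquist-style estimate); the pixel clock values are the standard SMPTE figures, which include blanking:

```python
# Rough estimate of analog component bandwidth needed per format.
# Assumption: required bandwidth ~ pixel clock / 2 (Nyquist-style estimate).
# Pixel clocks are the standard SMPTE values (including blanking intervals).
PIXEL_CLOCK_MHZ = {
    "480p":  27.0,    # SMPTE 293M
    "720p":  74.25,   # SMPTE 296M
    "1080i": 74.25,   # SMPTE 274M (interlaced, 60 Hz fields)
    "1080p": 148.5,   # SMPTE 274M (progressive, 60 Hz)
}

def required_bandwidth_mhz(fmt):
    """Approximate analog video bandwidth (MHz) needed to pass this format."""
    return PIXEL_CLOCK_MHZ[fmt] / 2

for fmt, clock in PIXEL_CLOCK_MHZ.items():
    need = required_bandwidth_mhz(fmt)
    verdict = "fits" if need <= 50 else "exceeds"
    print(f"{fmt}: ~{need:.1f} MHz needed -> {verdict} a 50 MHz switcher")
```

Under that assumption, 1080i comes out to roughly 37 MHz (consistent with the ~40 MHz figure above) and 1080p to roughly 74 MHz, which is why a 50 MHz switcher handles current HD fine but would be marginal for anything beyond 1080i.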
 
Technically, the frequency won't tell you everything.


The best thing to do is, AFTER YOUR DISPLAY IS CALIBRATED, test it. If you don't see a difference, then use the receiver if it's more convenient.

The only reasons I can think of NOT to use the receiver (barring a degraded picture) are:

1) You don't use your speakers all the time and don't want the receiver on.

2) Your display has configurable video memory for the inputs. In that case, you may want to take the time to individually calibrate each dedicated input for what you will normally have hooked up to it*.


* In general, this means you would calibrate each input based on connection type - NOT on what happens to be hooked up. Consider picking up AVIA or Digital Video Essentials for the calibration, and use the appropriate output from a decent DVD player.
 