
Severe banding with Denon home theater receiver
tooki
Admin Emeritus
Join Date: Oct 1999
Location: Zurich, Switzerland
Status: Offline
Nov 23, 2009, 06:09 PM
 
Hey everyone,

I'm posting on behalf of a frustrated coworker, who's having trouble with his new HDTV and Denon receiver. I was hoping someone here might have some ideas on what the problem is and how to fix it.

He got a new Samsung 46" LCD TV (series 8 I believe), a Denon Blu-Ray player, and a Denon AVR-1910 receiver.

The Blu-Ray, cable box, and media center PC are all connected to the receiver via HDMI, and the receiver is also connected to the TV using HDMI.

The receiver is introducing severe banding into many gradients -- bands of color noticeably lighter than the surrounding colors of the gradient. Connecting the video source directly to the TV eliminates the banding. Here are photos of the effect. (There is slight moiré from photographing, please ignore that.)


Do any of you home theater gurus have any idea? Neither my coworker nor I have been able to find any references to this online.


Thanks!
antonio


Direct from Blu-Ray to TV: [photo]

Via receiver: [photo]

Direct from Blu-Ray: [photo]

Via receiver: [photo]

Direct from Blu-Ray: [photo]

Via receiver: [photo]

Direct from Blu-Ray: [photo]

Via receiver: [photo]
(Original image sources: http://www.flickr.com/photos/pfmeure...7622854666968/ )
     
iM@k
Senior User
Join Date: Jun 2007
Location: Manch-Vegas, NH
Status: Offline
Nov 23, 2009, 06:27 PM
 
Hmmm, you know what it looks like? It looks like what happened in OS 9 when you reduced the number of colors displayed on a monitor and it banded the gradients to compensate.
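(For anyone who never saw that OS 9 effect, here's a toy Python sketch of how cutting the bit depth turns a smooth gradient into flat bands -- purely an illustration, nothing to do with the Denon's actual processing:)

```python
# Toy illustration: quantizing a smooth 8-bit gradient down to 5 bits
# per channel produces visible "stair-step" bands.
gradient = list(range(256))          # smooth 0..255 ramp

def quantize(value, bits):
    """Reduce an 8-bit value to the given bit depth, then scale back."""
    step = 256 // (1 << bits)        # e.g. step of 8 for 5 bits
    return (value // step) * step

banded = [quantize(v, 5) for v in gradient]

# The smooth ramp has 256 distinct levels; the quantized one only 32,
# so each level spans a flat 8-pixel-wide band.
print(len(set(gradient)), len(set(banded)))  # 256 32
```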
What, me worry?
     
olePigeon
Clinically Insane
Join Date: Dec 1999
Status: Offline
Nov 23, 2009, 06:33 PM
 
Does the receiver do any sort of compression, scaling, etc. before it sends the video on?

Do you have analogue hookups you can try to see if it does the banding still?
"…I contend that we are both atheists. I just believe in one fewer god than
you do. When you understand why you dismiss all the other possible gods,
you will understand why I dismiss yours." - Stephen F. Roberts
     
Oneota
Professional Poster
Join Date: May 2000
Location: Urbandale, IA
Status: Offline
Nov 23, 2009, 06:41 PM
 
My thoughts are some sort of mismatch on the resolutions -- like the Denon thinks the TV is only capable of 720p and is scaling things down to that res before sending it to the TV, which then has to upscale back to 1080p. Make sure that every device along the entire path is set to 1080p, or turn off the processing/scaling options altogether, if you can (be sure to check both the Blu-Ray player and the receiver).
"Yields a falsehood when preceded by its quotation" yields a falsehood when preceded by its quotation.
     
olePigeon
Clinically Insane
Join Date: Dec 1999
Status: Offline
Nov 23, 2009, 06:53 PM
 
Can I just say that HDMI sucks?
     
Oneota
Professional Poster
Join Date: May 2000
Location: Urbandale, IA
Status: Offline
Nov 23, 2009, 07:21 PM
 
Originally Posted by olePigeon View Post
Can I just say that HDMI sucks?
As long as you're using a good quality cable, I've never had any problems with it. I've had an inferior-quality cable introduce snow-like pixelation and dropped frames before, but swapping in a better cable cleared it right up.

Tooki -- the quality of the cables might be another thing worth double-checking. If you've got known-good cables from another setup, try swapping them in and see if things improve.
     
residentEvil
Professional Poster
Join Date: Jan 2000
Location: Detroit
Status: Offline
Nov 23, 2009, 07:57 PM
 
i'd say the receiver/source connections didn't pick the "optimal" setting. when you connect directly to the TV via hdmi, it knows what the TV can/can't do. but going through the receiver, it can't.

and the obvious thing to try, when you do:

cable ->(cable 1) receiver -> (cable 4) tv
media ->(cable 2) receiver -> (cable 4) tv
blu-ray ->(cable 3) receiver -> (cable 4) tv

are you using cable 1/2/3 to connect one device at a time directly to the TV to get a good picture? or are you using cable 4 to do it? if you are using 1/2/3 to go direct and never cable 4... maybe cable 4 is bad? try using cable 4 to make the connection for a component -- does it look bad with it?
     
olePigeon
Clinically Insane
Join Date: Dec 1999
Status: Offline
Nov 24, 2009, 01:57 AM
 
Originally Posted by Oneota View Post
As long as you're using a good quality cable, I've never had any problems with it. I've had an inferior-quality cable introduce snow-like pixelation and dropped frames before, but swapping in a better cable cleared it right up.
Because it sucks.
     
lexapro
Baninated
Join Date: Mar 2008
Status: Offline
Nov 24, 2009, 03:15 AM
 
Disengage the cloaking device.
     
CRASH HARDDRIVE
Addicted to MacNN
Join Date: May 2001
Location: Zip, Boom, Bam
Status: Offline
Nov 24, 2009, 03:41 AM
 
Eeesh. That's pretty awful.

I notice the receiver manual says:

Functions usable with HDMI connections

Deep Color
Eliminates on-screen color banding, for smooth tonal transitions and
subtle gradations between colors.

x.v.Color
Enables displays with natural, vivid colors. “x.v.Color” is a Sony
registered trademark.

NOTE:
These functions will not work if the device connected to the HDMI
terminal does not support Deep Color or x.v.Color signal transfer or
the Auto Lip Sync function.
So does the receiver have a Deep Color HDMI setting?
     
CRASH HARDDRIVE
Addicted to MacNN
Join Date: May 2001
Location: Zip, Boom, Bam
Status: Offline
Nov 24, 2009, 03:42 AM
 
Originally Posted by olePigeon View Post
Can I just say that HDMI sucks?
Just curious what you would consider a better alternative?
     
Oneota
Professional Poster
Join Date: May 2000
Location: Urbandale, IA
Status: Offline
Nov 24, 2009, 10:19 AM
 
Originally Posted by olePigeon View Post
Because it sucks.
If by "it," you mean the specific, crappy cable, then yes. HDMI, in general, is pretty spiffy. Yes, HDCP is a crock of sh!t that I wish we didn't have to live with, but so are Somali pirates, tsunamis, and the IRS, so I guess life isn't all roses and puppy dogs.
"Yields a falsehood when preceded by its quotation" yields a falsehood when preceded by its quotation.
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Nov 24, 2009, 10:21 AM
 
Originally Posted by CRASH HARDDRIVE View Post
Just curious what you would consider a better alternative?
Composite.
     
Oneota
Professional Poster
Join Date: May 2000
Location: Urbandale, IA
Status: Offline
Nov 24, 2009, 10:47 AM
 
Originally Posted by Laminar View Post
Composite.
…Going into an RF modulator, with the TV turned to Channel 3.
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Nov 24, 2009, 11:00 AM
 
Originally Posted by Oneota View Post
…Going into an RF modulator, with the TV turned to Channel 3.
It worked for the Sega Genesis, I don't see why you have to get greedy.
     
tooki  (op)
Admin Emeritus
Join Date: Oct 1999
Location: Zurich, Switzerland
Status: Offline
Nov 24, 2009, 04:05 PM
 
Originally Posted by olePigeon View Post
Does the receiver do any sort of compression, scaling, etc. before it sends the video on?

Do you have analogue hookups you can try to see if it does the banding still?
The receiver does upscaling from all sources to 1080p, so the scaler chip is of course my number 1 suspect. That said, the Blu-Ray player is outputting 1080p, so there's no scaling to do.

We haven't tried it with analog, but since Blu-Ray won't output 1080p on analog, that's not really an option.

Originally Posted by Oneota View Post
My thoughts are some sort of mis-match on the resolutions -- like the Denon thinks the TV is only capable of 720p and is scaling things down to that res before sending it to the TV, which then has to upscale back to 1080p. Make sure that every device along the entire path is set to 1080p, or turn off the processing/scaling options altogether, if you can (be sure to check both the Blu-Ray player and the receiver).
Everything is set to 1080p, and the scaler cannot be turned off. These are not resolution-dependent problems -- they appear on SD programming from DVD or the Media Center as well. If you look at the bands, they are not just the lack of smooth gradations, they are actually lighter than the colors on both sides of the band.

Originally Posted by olePigeon View Post
Can I just say that HDMI sucks?
Yes. It doth suck.

Originally Posted by Oneota View Post
As long as you're using a good quality cable, I've never had any problems with it. I've had an inferior-quality cable introduce snow-like pixelation and dropped frames before, but swapping in a better cable cleared it right up.

Tooki -- the quality of the cables might be another thing worth double-checking. If you've got known-good cables from another setup, try swapping them in and see if things improve.
and
Originally Posted by residentEvil View Post
i'd say the receiver/source connections didn't pick the "optimal" setting. when you connect directly to the TV via hdmi, it knows what the TV can/can't do. but going through the receiver, it can't.

and the obvious thing to try, when you do:

cable ->(cable 1) receiver -> (cable 4) tv
media ->(cable 2) receiver -> (cable 4) tv
blu-ray ->(cable 3) receiver -> (cable 4) tv

are you using cable 1/2/3 to connect one device at a time directly to the TV to get a good picture? or are you using cable 4 to do it? if you are using 1/2/3 to go direct and never cable 4... maybe cable 4 is bad? try using cable 4 to make the connection for a component -- does it look bad with it?
Given what I know about the failure mode of DVI/HDMI signals, I don't think cable quality could cause these specific issues, which appear to me to be a glitch in a gamma curve.
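(To illustrate what I mean by a gamma-curve glitch -- a hypothetical Python sketch, not anything from Denon's actual firmware -- a lookup table with a few corrupted entries produces narrow bands brighter than the values on either side, which random cable errors can't do:)

```python
# Hypothetical sketch of a glitched gamma lookup table (LUT).
# A correct LUT maps input levels monotonically; a few corrupted
# entries create bands that are *lighter* than their neighbors.
correct_lut = list(range(256))        # identity curve, for simplicity

glitched_lut = correct_lut[:]
for level in range(96, 104):          # a corrupted run of LUT entries
    glitched_lut[level] = min(255, correct_lut[level] + 40)

ramp = list(range(256))               # a smooth input gradient
out = [glitched_lut[v] for v in ramp]

# Inputs 96..103 come out ~40 levels brighter than both neighbors --
# a band lighter than the gradient on either side of it.
print(out[95], out[100], out[105])    # 95 140 105
```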

Originally Posted by CRASH HARDDRIVE View Post
Eeesh. That's pretty awful.

I notice the receiver manual says:
[snip]
So does the reciever have a Deep Color HDMI setting?
There's no setting for it in the receiver, just the TV and Blu-Ray. It's activated on both, but turning it off made no difference.

That said, a Deep Color signal being fed to a non-Deep Color TV should result in no image, not just banding.

AFAIK, there isn't any Deep Color source material anyway.

thanks so far!
antonio
     
Oneota
Professional Poster
Join Date: May 2000
Location: Urbandale, IA
Status: Offline
Nov 24, 2009, 05:24 PM
 
In that case, I would suspect the scaler in the receiver, like you said. Since it's new, try exchanging it at the store and see if the issue crops up with the new unit.
     
shabbasuraj
Mac Elite
Join Date: Aug 2003
Status: Offline
Nov 24, 2009, 06:16 PM
 
There is something wrong with that receiver. Unless of course you are using a dollar store HDMI cable..
     
tooki  (op)
Admin Emeritus
Join Date: Oct 1999
Location: Zurich, Switzerland
Status: Offline
Nov 25, 2009, 07:55 PM
 
For 6-foot lengths, even dollar-store HDMI should do fine. Regardless, HDMI cable problems cannot cause this type of problem, because they cannot affect specific colors or brightness ranges -- they affect all bits in the datastream (be they image, audio, or control) indiscriminately. In HDMI/DVI, that results in "glitter" effects, which is not what we have here.
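(If you want to see why, here's a toy Python model -- my own sketch, not a real HDMI simulation -- of how random bit errors scatter damage across the whole frame instead of hitting one brightness range:)

```python
import random

# Toy model of cable-induced errors: bit flips hit random bits of
# random pixels, so the damage shows up as scattered "glitter,"
# not as bands confined to a particular brightness range.
random.seed(1)
frame = [128] * 10_000                # a flat mid-gray frame

for _ in range(50):                   # 50 random single-bit errors
    pixel = random.randrange(len(frame))
    bit = random.randrange(8)
    frame[pixel] ^= 1 << bit          # flip one bit of one pixel

damaged = [i for i, v in enumerate(frame) if v != 128]
print(len(damaged))                   # roughly 50 isolated pixels
```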
     
wolfen
Mac Elite
Join Date: Jul 2002
Location: On this side of there
Status: Offline
Nov 26, 2009, 10:49 PM
 
I am clearly not the home theater expert, but is there a compelling reason to put your video through the receiver at all? I have always avoided doing so.
Do you want forgiveness or respect?
     
olePigeon
Clinically Insane
Join Date: Dec 1999
Status: Offline
Nov 27, 2009, 03:59 AM
 
Originally Posted by CRASH HARDDRIVE View Post
Just curious what you would consider a better alternative?
Pretty much anything that isn't DRMed and includes a cable that doesn't fall out of the god damn TV when I adjust its position.

Personally, if I could choose the standards, I'd pick one that involves both digital and analogue. The digital signal would use a fiber optic cable, simply because the theoretical maximum throughput is as fast as the hardware that pushes the signal -- you'd never need a new cable. The other half of the standard would be the analogue signal over properly shielded component video & audio cables. None of it would be encrypted.
     
olePigeon
Clinically Insane
Join Date: Dec 1999
Status: Offline
Nov 27, 2009, 04:06 AM
 
Originally Posted by Oneota View Post
If by "it," you mean the specific, crappy cable, then yes. HDMI, in general, is pretty spiffy. Yes, HDCP is a crock of sh!t that I wish we didn't have to live with, but so are Somali pirates, tsunamis, and the IRS, so I guess life isn't all roses and puppy dogs.
So long as the MPAA doesn't get their wish to kill off analogue, they can have their sh*tty HDMI cables and HDCP crap. I'll stick with component.
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Nov 27, 2009, 09:32 AM
 
I think it's obvious that your receiver is processing the video and not doing it well. Have you tried component through the receiver (if that's an option) to see if you get the same effect?

I get this sort of effect with my composite video-only satellite receiver going into my TV, but then I got that on my old analog-only TV as well. It's got to be a compression artifact with the satellite receiver.

Glenn -----OTR/L, MOT, Tx
     
hayesk
Guest
Status:
Nov 27, 2009, 02:05 PM
 
Originally Posted by Laminar View Post
Composite.
I'm going to go out on a limb here and assume you meant to say Component.
     
Person Man
Professional Poster
Join Date: Jun 2001
Location: Northwest Ohio
Status: Offline
Nov 27, 2009, 02:14 PM
 
Originally Posted by wolfen View Post
I am clearly not the home theater expert, but is there a compelling reason to put your video through the receiver at all? I have always avoided doing so.
Yes. With Blu-Ray, if you want the lossless audio formats (Linear PCM, Dolby TrueHD or DTS Master Audio) to be passed to the receiver for decoding then you must use HDMI. Also, the "powers that be" who control Blu-Ray have mandated that the high resolution codecs can only be sent over a "protected data path," which means encryption, like HDCP. HDMI fits the bill for all of that. Most Blu-Ray players only have one HDMI port since it carries both sound and video data. Perhaps in the future they'll include two HDMI ports (one for video, one for audio... but that defeats the purpose of having one cable carry everything).

My receiver (Sony) will upconvert non 1080p sources if I choose, or pass the video signal straight through with no processing. No problems with my setup at all. One cable from Blu-Ray player to Receiver, and one from receiver to TV.
     
lexapro
Baninated
Join Date: Mar 2008
Status: Offline
Nov 27, 2009, 02:23 PM
 
I really don't know what any of you are talking about. I don't even have a TV!
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Nov 27, 2009, 03:30 PM
 
Originally Posted by hayesk View Post
I'm going to go out on a limb here and assume you meant to say Component.
I know what I said.

     
wolfen
Mac Elite
Join Date: Jul 2002
Location: On this side of there
Status: Offline
Nov 27, 2009, 09:22 PM
 
Originally Posted by Person Man View Post
Yes. With Blu-Ray, if you want the lossless audio formats (Linear PCM, Dolby TrueHD or DTS Master Audio) to be passed to the receiver for decoding then you must use HDMI. Also, the "powers that be" who control Blu-Ray have mandated that the high resolution codecs can only be sent over a "protected data path," which means encryption, like HDCP. HDMI fits the bill for all of that. Most Blu-Ray players only have one HDMI port since it carries both sound and video data. Perhaps in the future they'll include two HDMI ports (one for video, one for audio... but that defeats the purpose of having one cable carry everything).

My receiver (Sony) will upconvert non 1080p sources if I choose, or pass the video signal straight through with no processing. No problems with my setup at all. One cable from Blu-Ray player to Receiver, and one from receiver to TV.
So my PS3 HDMI-connected to the plasma, with an optical audio line from the plasma to the receiver, is somehow lacking in its performance relative to routing everything through my receiver first?? That's weird.

Got any opinions about Blu Ray's odds of survival?
     
jersey
Senior User
Join Date: Dec 2002
Status: Offline
Nov 29, 2009, 10:12 PM
 
I, like others, vote receiver.

I have that exact Denon model, hooked to a comparable tv, and have no issues.

The x.v.Color and Deep Color should eliminate banding, and the receiver supports both. Of course, if your TV doesn't have a 10-bit panel then it's all useless, but the new 8 series Sams should.
     
climber
Mac Elite
Join Date: Dec 2001
Location: Pacific NW
Status: Offline
Nov 30, 2009, 02:23 AM
 
Originally Posted by wolfen View Post
So my PS3 HDMI-connected to the plasma, with an optical audio line from the plasma to the receiver, is somehow lacking in its performance relative to routing everything through my receiver first?? That's weird.

Got any opinions about Blu Ray's odds of survival?
Yes, that is correct: for the best audio from Blu-ray, you need HDMI. As I understand it, under the current specs for digital coax or optical audio connections there is not enough bandwidth for all the data.

As best as I can explain it, a blu-ray disk has both a compressed core audio track, similar to current DVDs, and an extra layer of data that, with some fancy calculations, allows the player to reconstruct the same uncompressed audio track as the studio master. The digital audio cable will work, but will default to the DVD-quality core track.

Now, some will argue that most people can't differentiate between the compressed core and a technically superior track like the Dolby TrueHD track. It is sort of the same argument as the difference between a high-quality MP3/AAC and the original CD. Certainly data is lost in the compression process, but it is unlikely to be heard in a pair of earbuds or on an inexpensive set of speakers. If your equipment and your ears are up to it, then an upgrade to HDMI may improve things.
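(If it helps, the core-plus-residual idea can be sketched in a few lines of Python -- a grossly simplified model, not the actual TrueHD or DTS-HD math:)

```python
# Simplified model of lossless audio as "lossy core + residual":
# the residual is exactly what the lossy core threw away, so
# core + residual rebuilds the studio master bit-for-bit.
master = [1000, -2047, 312, 7, -998, 2047]   # original PCM samples

def lossy_core(samples, step=16):
    """Crude stand-in for 'compression': round to a coarser grid."""
    return [(s // step) * step for s in samples]

core = lossy_core(master)
residual = [m - c for m, c in zip(master, core)]   # what was lost

# Over optical/coax you only get the core; over HDMI the extra
# residual data fits too, so the player can rebuild the master.
rebuilt = [c + r for c, r in zip(core, residual)]
print(rebuilt == master)   # True
```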

The biggest problem with HDMI is poor engineering. The connector just sucks, and the handshaking issues are a nightmare. Even when things work it sucks. But right now it is the only way to get a digital video signal from a source to a display. Yes, you can convert it back and forth to analog via component cables, but that makes as much sense as using a VGA cable on a computer.

Blu-Ray will be around as long as people are content with purchasing physical media. I do not see a significant change in attitudes about purchasing movies and music vs. just renting. People like to own stuff, and I do not see that changing anytime soon. I also do not see people purchasing HD movies to store on hard drives like they do now with music. The files are just too big. I can back up a ton of music on one DVD; to back up an HD movie requires something more like... well, I guess a blu-ray disk would work... but that sort of defeats the purpose. I do not see the average consumer managing that type of data on a Mac, let alone in Windows. Blu-Ray will be around for a good ten years. I hope!

The original poster has a problem with the receiver. Even poor-quality upscaling will not damage the signal that much. Try swapping it out -- I think it is broken.
climber
     
   