In response to several (dozen) people asking why the backgrounds in HW2 display visible bands of different colours (reminiscent of a 16-bit background texture), Relic replied with an explanation.
The original thread containing this information can be found here.

It's not running in 16bit.
It's just that the process used to load the backgrounds and prepare them for brightening / dimming during the game results in some 2 colour jumps which give the banding. At least that's how it was explained to me when I asked.
So that explains the backgrounds.
The banding you see in some of the effects and the dust clouds is due to texture compression.
Remy had also made an earlier comment on the same subject in a demo thread.
That thread is here.

I think what you are talking about is part of HW2. In certain parts of certain backgrounds where there is a small colour fade from one part to another, 8 bits per channel just doesn't cut it.
You are in 32bit colour mode. That is eight bits for each of the RGBA channels. Imagine it is a red part you are talking about, maybe from light red to medium/light red.
8 bits gives you 256 unique colours between bright red and dark red (AKA black). If you stretched this across your entire screen you would have 256 colours spread over 1600 pixels, giving you bands of colour just over 6 pixels thick! Unfortunately you'll be going from light red to medium light red, for which there may be only 16 different shades of red, but far more than 16 pixels to cover. So you end up with even more noticeable banding.
The real solution is we need video cards with better than 32bit colour. 64bit or 128bit should be good enough.
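The band-width arithmetic in the quote above can be sketched in a few lines. This is only an illustration; the 1600-pixel screen width and the shade counts are the figures Remy quotes, not anything taken from HW2 itself:

```python
# Band thickness when a smooth gradient is quantized to a limited
# number of colour shades. The numbers below are the ones from the
# quoted post (1600-pixel-wide screen).

def band_width_px(screen_width_px, shades):
    """Average width in pixels of each visible colour band."""
    return screen_width_px / shades

# Full red range: 256 levels stretched across a 1600-pixel screen.
print(band_width_px(1600, 256))  # → 6.25 pixels per band

# A subtle fade using only 16 distinct shades over the same width:
print(band_width_px(1600, 16))   # → 100.0 pixels per band
```

The second case is why the subtle fades in the backgrounds band so visibly: fewer usable shades over the same distance means much wider bands.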
On the release of the 1.1 patch, Remy made this comment in this thread.
Hopefully people will now understand why backgrounds appear banded.

ATI and nvidia have 10bit DACs, but they do not have a 10bit per channel frame buffer. The 9500+ sort of does, but I believe they unfortunately haven't been able to expose this to applications like HW2 due to some unforeseen limitations. I originally saw Radeon 9700 previews say they would have a 10:10:10:2 frame buffer mode, but I haven't seen them mention that lately.
nvidia and ati also have floating-point 16bit or 32bit textures, but these are textures, not frame buffers. The banding is caused by not having enough frame buffer bits. Even if HW2 rendered to a floating-point texture with 32 bits per channel on Radeon/GeForceFX cards, it would be converted down to just 8 bits per channel when it finally goes to the frame buffer, and would still show banding.
With 10 bits per channel instead of 8 you will still have banding, but the bands will be four times more noticeable in the 8bit version. So a 32bit 10:10:10:2 frame buffer will actually be more effective at reducing banding than a 128bit 32:32:32:32 texture sent to a regular 32bit 8:8:8:8 frame buffer. This is key: even though a Radeon/GeForceFX can do more precise calculations with textures internally, they still go down to an 8bit per channel frame buffer at the end, while the Parhelia has 10 bits per channel.
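Remy's point about frame buffer precision can be illustrated with a small quantization sketch. The `quantize` helper and the choice of a gradient covering 1/16 of the channel range are assumptions for illustration only; this is not HW2 or driver code:

```python
# Illustrative sketch: rounding a high-precision channel value down to
# an n-bit frame buffer. A 10bit buffer has four times as many levels
# as an 8bit one, so a subtle gradient keeps four times as many shades
# and its bands are a quarter as wide.

def quantize(value, bits):
    """Round a channel value in [0.0, 1.0] to the nearest n-bit level."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A subtle fade spanning 1/16 of the channel range (e.g. light red
# to medium light red), sampled at 1000 points:
lo, hi = 0.5, 0.5625
distinct_8bit = len({quantize(lo + (hi - lo) * i / 999, 8) for i in range(1000)})
distinct_10bit = len({quantize(lo + (hi - lo) * i / 999, 10) for i in range(1000)})
print(distinct_8bit, distinct_10bit)  # 10bit keeps 4x the shades
```

However precise the intermediate texture maths, it is this final quantization to the frame buffer's bit depth that determines how many shades survive, which is Remy's point.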
Parhelia owners are claiming that the Parhelia, with its "gigacolor" 10bit per channel mode, reduces banding. From the screenshots it seems to be true.
The 1.1 release is a patch to fix crashes, network bugs, and gameplay balance issues. It isn't meant to add new features, which could potentially introduce new bugs.