This normalization was added in 02ac5e95c84a1d9a46df1dc4102342fb653e36ee and changed to use floats in 4bf031c0646e91b35777f1ba4e2b0328063bb666. The conversion to floats means that there is sometimes insufficient precision for the normalization process, which results in values of NaN or infinity. Performing the whole process with doubles prevents that, but games also sometimes set the values to NaN or infinity directly (possibly accidentally, because the values were never initialized since they are unused in the current configuration?).
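As a rough illustration of the precision issue (a minimal sketch with hypothetical values and function names, not the actual Dolphin code): squaring a very small float underflows to zero, so the subsequent division produces infinity or NaN, while doubles have enough range to survive the intermediate step.

    #include <cmath>

    // With floats, x * x underflows to 0 for very small x, so dividing by
    // the computed length yields infinity (or NaN for 0 / 0).
    float normalize_float(float x)
    {
      return x / std::sqrt(x * x);  // x = 1e-30f: x * x underflows to 0
    }

    // Performing the intermediate math in doubles keeps enough range, and
    // the result can be rounded back to float at the end.
    float normalize_double(float x)
    {
      const double xd = x;
      return static_cast<float>(xd / std::sqrt(xd * xd));
    }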
The version of Mesa currently in use on FifoCI (20.3.5) has issues with NaN. Although this bug has been fixed (b3f3287eac in 21.2.0), FifoCI is stuck with the older version.
This change may or may not match real hardware, but it should result in the same behavior Dolphin already has, while working around the Mesa bug.
https://bugs.dolphin-emu.org/issues/12977 indicates that this hang happens on startup of Spider-Man 2, even in single-core. I don't have the game, so I can't directly determine why it happens, but presumably real hardware does not hang in this case, so we can at least make the hang less obtrusive.
Looks like a copy-paste gone wrong. The compute shaders for the other formats use a group size of 8 * 8, whereas the CMPR compute shader is supposed to use a flattened 64 * 1, as I understand it.
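To make the difference concrete, here is a hedged sketch of what the flattened group implies (illustrative names only, not the actual shader source; the GLSL layout declarations are shown as comments). Both layouts cover 64 threads, but the flattened one presumably has to recover a 2D position from the flat invocation index:

    // Other formats:  layout(local_size_x = 8, local_size_y = 8) in;
    // CMPR:           layout(local_size_x = 64, local_size_y = 1) in;
    void decode_cmpr_invocation(unsigned flat_index)  // 0..63
    {
      const unsigned x = flat_index % 8;
      const unsigned y = flat_index / 8;
      // ... decode the texel at (x, y) within this group ...
    }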
This struct is the only one in BPMemory that uses u64 as its base. These fields exist to allow viewing it as two u32s instead. This view isn't used by Dolphin right now, but it is used in the copy of BPMemory.h used by hwtests.
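A minimal sketch of the pattern (plain bit operations standing in for the actual BitField definitions; the struct and member names are illustrative):

    #include <cstdint>

    // A u64-based register with accessors so it can also be viewed (and
    // written) as two u32 halves, e.g. by hwtests.
    struct ExampleU64Reg
    {
      std::uint64_t hex = 0;

      std::uint32_t Lo() const { return static_cast<std::uint32_t>(hex); }
      std::uint32_t Hi() const { return static_cast<std::uint32_t>(hex >> 32); }

      void SetLo(std::uint32_t lo)
      {
        hex = (hex & 0xFFFFFFFF00000000ULL) | lo;
      }
      void SetHi(std::uint32_t hi)
      {
        hex = (hex & 0x00000000FFFFFFFFULL) | (std::uint64_t{hi} << 32);
      }
    };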
This also changes the behavior for the invalid gamma value, which was confirmed to behave the same as 2.2.
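For reference, a hedged sketch of the mapping (the enum values are assumed from the 2-bit gamma field; names are illustrative):

    enum class GammaCorrection : unsigned
    {
      Gamma1_0 = 0,
      Gamma1_7 = 1,
      Gamma2_2 = 2,
      Invalid = 3,  // confirmed to behave the same as 2.2
    };

    float GammaExponent(GammaCorrection gamma)
    {
      switch (gamma)
      {
      case GammaCorrection::Gamma1_0:
        return 1.0f;
      case GammaCorrection::Gamma1_7:
        return 1.7f;
      case GammaCorrection::Gamma2_2:
      case GammaCorrection::Invalid:  // invalid value falls through to 2.2
      default:
        return 2.2f;
      }
    }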
Note that currently, the gamma value is only used for XFB copies, even though hardware testing indicates it also works for EFB copies. This will be changed in a later commit.
It was named yuv in 522746b2c223f37c45569ee7fd4a226b278cb6d9, but hardware testing indicates that the bit does nothing (the intensity format bit enables YUV conversion instead).
The only remaining casts for these types that I know of are in TextureInfo (where format_name is set to the int version of the format; since that affects filenames and would probably break resource packs, I'm not changing it) and in TextureDecoder_Common's TexDecoder_DrawOverlay, which will be handled separately.
Adds a pass that processes driver deficiencies between UID caching and use, giving it a full view of the whole pipeline, since some bugs/workarounds involve interactions between blend modes and the pixel shader.
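A minimal sketch of the shape of such a pass (all types and members here are hypothetical stand-ins, not Dolphin's actual API):

    // A fixup pass that runs after the UID is fetched from the cache but
    // before the pipeline is built, so a workaround can consider the blend
    // state and the pixel shader UID together.
    struct BlendStateSketch
    {
      bool dual_source_blend = false;
    };

    struct PixelShaderUidSketch
    {
      bool blend_in_shader = false;
    };

    struct PipelineConfigSketch
    {
      BlendStateSketch blend;
      PixelShaderUidSketch ps_uid;
    };

    PipelineConfigSketch ApplyDriverDeficiencies(PipelineConfigSketch config,
                                                 bool driver_breaks_dual_source)
    {
      if (driver_breaks_dual_source && config.blend.dual_source_blend)
      {
        // Disable the broken blend mode and emulate it in the pixel shader
        // instead; neither change makes sense without seeing both halves.
        config.blend.dual_source_blend = false;
        config.ps_uid.blend_in_shader = true;
      }
      return config;
    }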