https://www.khronos.org/registry/OpenGL/specs/gl/GLSLangSpec.1.20.pdf
Section 4.1.3 says that hexadecimal integer literals are supported, but
Nvidia have never read a specification since their founding, so their
engineers didn't know that hexadecimal integer literals are required to
be supported in order to advertise support for OpenGL versions that
include GLSL.
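
For reference, here is a minimal GLSL 1.20 fragment shader that exercises
a hexadecimal integer literal; the shader itself is an illustrative sketch
(the constant name and the output computation are made up), but the literal
syntax is exactly what section 4.1.3 describes:

    #version 120
    // Per section 4.1.3, an integer constant may be written in decimal,
    // octal (leading 0), or hexadecimal (leading 0x or 0X).
    const int mask = 0xFF; // hexadecimal literal; same value as 255
    void main() {
        // A conforming GLSL 1.20 compiler must accept the 0xFF above;
        // a driver that rejects it is out of spec.
        gl_FragColor = vec4(float(mask) / 255.0);
    }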