The affected tests fuzzily compare the results of using sized depth
textures (GL_DEPTH_COMPONENT{16,24,32}) with those of an unsized depth
texture (GL_DEPTH_COMPONENT), which is used as the reference.
According to OpenGL ES, the former write the depth into the R channel,
(D, 0.0, 0.0, 1.0), while the latter writes the depth into all three
RGB channels, (D, D, D, 1.0).
Roughly speaking, the image in the first case is red-ish while the one
in the second case is grayscale, and thus the comparison fails.
As only the depth is of interest, modify the test so that the
reference (unsized) case also writes only the red channel, which makes
the two results comparable.
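For illustration, per pixel the two cases effectively produce (with D
the sampled depth; a sketch based on the behavior described above):

    sized GL_DEPTH_COMPONENT{16,24,32}: gl_FragColor = vec4(D, 0.0, 0.0, 1.0);
    unsized GL_DEPTH_COMPONENT (ref):   gl_FragColor = vec4(D, D, D, 1.0);

so the green and blue channels differ by D, which is what trips the
fuzzy comparison for non-trivial depth values.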
Components: OpenGL
VK-GL-CTS issue: 999
Affects:
KHR-GLES2.core.internalformat.texture2d.depth*
KHR-GL4*.internalformat.texture2d.depth*
Change-Id: I0bd9eea951eec0dcf91b072c476fbe0ae5292df1
"varying highp vec2 texcoord;\n"
"void main()\n"
"{\n"
- " gl_FragColor = texture2D(sampler, texcoord);\n"
+ " highp vec4 color = texture2D(sampler, texcoord);\n"
+ " gl_FragColor = ${CALCULATE_COLOR};\n"
"}\n";
+
+ if (internalFormat == GL_DEPTH_COMPONENT)
+ specializationMap["CALCULATE_COLOR"] = "vec4(color.r, 0.0, 0.0, 1.0)";
+ else
+ specializationMap["CALCULATE_COLOR"] = "color";
}
vs = tcu::StringTemplate(vs).specialize(specializationMap);
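
With this change the fragment shader's main() specializes, roughly, to
one of the following (a sketch; it assumes the template also declares
the sampler2D uniform, which is not shown in the hunk above):

    // unsized GL_DEPTH_COMPONENT (the reference): keep only the red channel
    highp vec4 color = texture2D(sampler, texcoord);
    gl_FragColor = vec4(color.r, 0.0, 0.0, 1.0);

    // sized GL_DEPTH_COMPONENT{16,24,32}: pass the sampled color through
    highp vec4 color = texture2D(sampler, texcoord);
    gl_FragColor = color;

Both paths now carry the depth only in the red channel of the rendered
image, so the fuzzy comparison sees matching data.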