Fix internal format/type for ES3 3D + depth/stencil negative API tests.
This is a port of commit ae7f8e0a07730e693b24d3dc7a23d2372319145e
from the ES 3.1 tests to the ES 3.0 tests.
According to the ES 3.2 specification:
"Textures with a base internal format of DEPTH_COMPONENT, DEPTH_STENCIL
or STENCIL_INDEX are supported by texture image specification commands
only if target is TEXTURE_2D, TEXTURE_2D_MULTISAMPLE, TEXTURE_2D_ARRAY,
TEXTURE_2D_MULTISAMPLE_ARRAY, TEXTURE_CUBE_MAP or TEXTURE_CUBE_MAP_ARRAY.
Using these formats in conjunction with any other target will result in
an INVALID_OPERATION error."
This subtest tried to check the above error condition, but it passed
GL_DEPTH_STENCIL / GL_DEPTH_COMPONENT as the format argument rather than
as internalFormat. Since the spec text calls out the "base internal
format", the depth/stencil format must be specified as internalFormat.
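
A rough sketch of the intended check for the GL_DEPTH_COMPONENT case
(the sizes and null data pointer are illustrative, not the test's
literal code):

    // Depth format passed as internalFormat; GL_TEXTURE_3D is then the
    // only invalid part, so GL_INVALID_OPERATION is expected for the
    // reason the test actually targets.
    glTexImage3D(GL_TEXTURE_3D, 0, GL_DEPTH_COMPONENT, 1, 1, 1, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, 0);
    // expect GL_INVALID_OPERATION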
For the GL_DEPTH_STENCIL case we also change the type from
GL_UNSIGNED_BYTE to GL_UNSIGNED_INT_24_8, since GL_DEPTH_STENCIL with
GL_UNSIGNED_BYTE is an invalid format/type combination and would
generate an error for a different reason than the one the test intends
to check.
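
A similar sketch for the corrected GL_DEPTH_STENCIL case (again with
illustrative arguments):

    // GL_DEPTH_STENCIL + GL_UNSIGNED_INT_24_8 is a legal format/type
    // pair, so only the disallowed GL_TEXTURE_3D target should cause
    // the error.
    glTexImage3D(GL_TEXTURE_3D, 0, GL_DEPTH_STENCIL, 1, 1, 1, 0,
                 GL_DEPTH_STENCIL, GL_UNSIGNED_INT_24_8, 0);
    // expect GL_INVALID_OPERATION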
Affects dEQP-GLES3.functional.negative_api.texture.teximage3d.
Bug: 34103293
Change-Id: Ie01e2d130bb1cadc821153487e3e41593e3ca15e