d3d12: also reject GDI-supporting pixel-formats
author Erik Faye-Lund <erik.faye-lund@collabora.com>
Tue, 16 Jun 2020 09:39:07 +0000 (11:39 +0200)
committer Erik Faye-Lund <erik.faye-lund@collabora.com>
Wed, 18 Nov 2020 10:20:02 +0000 (11:20 +0100)
In theory, it's possible to request a GDI-supporting, double-buffered
pixel format, which we cannot support using DXGI swapchains. So let's
return NULL in that case as well.

Reviewed-by: Charmaine Lee <charmainel@vmware.com>
Part-of: <https://gitlab.freedesktop.org/mesa/mesa/-/merge_requests/7535>

src/gallium/winsys/d3d12/wgl/d3d12_wgl_framebuffer.cpp

index 1d283d5..68b5b7d 100644
@@ -195,7 +195,8 @@ d3d12_wgl_create_framebuffer(struct pipe_screen *screen,
 {
    const struct stw_pixelformat_info *pfi =
       stw_pixelformat_get_info(iPixelFormat);
-   if (!(pfi->pfd.dwFlags & PFD_DOUBLEBUFFER))
+   if (!(pfi->pfd.dwFlags & PFD_DOUBLEBUFFER) ||
+       (pfi->pfd.dwFlags & PFD_SUPPORT_GDI))
       return NULL;
 
    struct d3d12_wgl_framebuffer *fb = CALLOC_STRUCT(d3d12_wgl_framebuffer);