Hello,
Two years ago I locally fixed a bug in the Civilization III game, caused by problems with 16bpp format support in OpenGL at the junction of the Wine and Mesa libraries. I described my findings here:
https://bugs.winehq.org/show_bug.cgi?id=41930#c60
I have now realized that the bug tracker alone may not be enough, and that writing to this mailing list may be better.
I'm wondering if anybody is interested in fixing this issue? I would be glad to work with somebody to fix this problem in the main Wine branch (I just don't know the correct workflow for upstreaming and discussing fixes like this).
With best regards, Vyacheslav Chigrin
On 6/20/25 18:45, vyacheslav.chigrin@izba.dev wrote:
Well, that's perfect timing, because we've recently dropped OSMesa in favor of pbuffer rendering through the GLX/EGL stack.
It turns out that this also broke the game (see https://bugs.winehq.org/show_bug.cgi?id=58384, which I was looking into earlier), but that should be fixed with https://gitlab.winehq.org/wine/wine/-/merge_requests/8384.
I think https://gitlab.winehq.org/wine/wine/-/merge_requests/8383 might also be useful here.
As far as I can see, with both MRs applied I don't see any terrain turning black anymore, because... it turns striped orange instead (is this progress?).
In short, memory DCs now use the accelerated GL stack to do their drawing, with some synchronization points: the bitmap is read into the GL front buffer when a GL context is made current, and, the other way around, the front buffer is copied back to the selected bitmap when the GL context is flushed.
This seems to be how Windows does it too, although we might be missing some sync points. It's not entirely clear when the data is being flushed in one way or the other on Windows.
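To make that concrete, here is a minimal sketch, under standard WGL assumptions, of where those two sync points would fall in an application's calls (error handling omitted; this is an illustration, not Wine code):

```c
#include <windows.h>
#include <GL/gl.h>

void draw_to_memory_dc(void)
{
    BITMAPINFO bmi = {{ sizeof(BITMAPINFOHEADER), 64, 64, 1, 32, BI_RGB }};
    PIXELFORMATDESCRIPTOR pfd =
    {
        sizeof(pfd), 1, PFD_SUPPORT_OPENGL | PFD_DRAW_TO_BITMAP,
        PFD_TYPE_RGBA, 32,
    };
    void *bits;
    HDC hdc = CreateCompatibleDC(NULL);
    HBITMAP bmp = CreateDIBSection(hdc, &bmi, DIB_RGB_COLORS, &bits, NULL, 0);
    HGLRC ctx;

    SelectObject(hdc, bmp);
    SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);
    ctx = wglCreateContext(hdc);

    /* sync point: the bitmap contents are read into the GL front buffer */
    wglMakeCurrent(hdc, ctx);

    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    /* sync point: the front buffer is copied back to the selected bitmap */
    glFinish();

    /* "bits" should now contain the rendered image */

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(ctx);
    DeleteObject(bmp);
    DeleteDC(hdc);
}
```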
We have a couple of tests about this; I would suggest having a look at `test_bitmap_rendering` in `dlls/opengl32/tests/opengl.c`, and maybe seeing if you can reproduce something similar to what the game is doing and find behavior differences from Windows.
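For instance, a rough sketch of such a check in the style of those tests, continuing from the setup in the snippet above (`ok()` is the assertion macro from `wine/test.h`; the expected pixel value is an assumption that a run on Windows would need to confirm):

```c
    UINT *pixels = bits; /* the DIB section pointer from the sketch above */

    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    /* candidate sync point: does the flush update the selected bitmap? */
    glFinish();
    ok(pixels[0] == 0x00ff0000, "got %#x\n", pixels[0]);

    /* candidate sync point: does releasing the context update it too? */
    wglMakeCurrent(NULL, NULL);
    ok(pixels[0] == 0x00ff0000, "got %#x\n", pixels[0]);
```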
The pixel formats are enumerated from the Linux GL configs, and it's possible that there are some things to tweak there to match what memory DCs are supposed to enumerate; but other than that, the pbuffer is created directly from the pixel format the game selected.
Here too, it should be possible to write tests to compare Windows vs Wine, and change our implementation to better match it.
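For example, a small sketch that dumps the formats a memory DC enumerates, which could be run on both systems to diff the output (plain GDI calls only; nothing here is Wine-specific):

```c
#include <stdio.h>
#include <windows.h>

void dump_bitmap_pixel_formats(void)
{
    HDC hdc = CreateCompatibleDC(NULL);
    PIXELFORMATDESCRIPTOR pfd;
    int i, count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (i = 1; i <= count; i++)
    {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        /* only the formats usable for drawing to bitmaps matter here */
        if (!(pfd.dwFlags & PFD_DRAW_TO_BITMAP)) continue;
        printf("format %d: %u bpp, depth %u, flags %#lx\n",
               i, (unsigned)pfd.cColorBits, (unsigned)pfd.cDepthBits, pfd.dwFlags);
    }
    DeleteDC(hdc);
}
```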
Cheers,