The setDisplayMode() specification is not clear about the interpretation of the BIT_DEPTH_MULTI constant. This has become an issue now that DisplayMode switching and fullscreen exclusive mode are available on Linux, where we have historically used BIT_DEPTH_MULTI to describe the bit depth in those configurations. There are a couple of questions to address here:

- Should we interpret BIT_DEPTH_MULTI as "any bit depth"? For example, if an application calls setDisplayMode() with BIT_DEPTH_MULTI, should we simply select any available DisplayMode that otherwise matches (width, height, refresh rate)? (This is similar to the question posed in 5041225, where we now pick some DisplayMode when the user specifies REFRESH_RATE_UNKNOWN, as long as the mode otherwise matches, i.e. the width, height, and bit depth all match.)

- Likewise, if the application tries to set a DisplayMode with a bit depth of, say, 32, should we treat that as a successful match against BIT_DEPTH_MULTI? Many fullscreen apps have already been written with Windows in mind, where 32-bit DisplayModes are almost always available, so developers may be in for a surprise when they try their app on Linux, where 32-bit mode may not be enabled for their particular configuration. (Many distros default to 16-bit mode, for example.)

There are no clear-cut answers here, but we should give these issues some thought to help solidify the spec in this area.
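To make the two questions concrete, the sketch below shows one possible "wildcard" matching rule: BIT_DEPTH_MULTI on either side (the requested mode or an advertised mode) matches any depth, and REFRESH_RATE_UNKNOWN in the request matches any refresh rate. This is only an illustration of the proposed interpretation, not the behavior the spec currently guarantees; the `matches` helper and the sample modes are hypothetical.

```java
import java.awt.DisplayMode;

public class DisplayModeMatcher {
    // Hypothetical matching rule under the wildcard interpretation:
    // BIT_DEPTH_MULTI on either side matches any depth, and
    // REFRESH_RATE_UNKNOWN in the request matches any refresh rate.
    static boolean matches(DisplayMode requested, DisplayMode candidate) {
        if (requested.getWidth() != candidate.getWidth()
                || requested.getHeight() != candidate.getHeight()) {
            return false;
        }
        boolean depthOk =
            requested.getBitDepth() == DisplayMode.BIT_DEPTH_MULTI
            || candidate.getBitDepth() == DisplayMode.BIT_DEPTH_MULTI
            || requested.getBitDepth() == candidate.getBitDepth();
        boolean rateOk =
            requested.getRefreshRate() == DisplayMode.REFRESH_RATE_UNKNOWN
            || requested.getRefreshRate() == candidate.getRefreshRate();
        return depthOk && rateOk;
    }

    public static void main(String[] args) {
        // A mode as a Linux X server might report it: depth known
        // only as BIT_DEPTH_MULTI.
        DisplayMode linuxMode =
            new DisplayMode(1024, 768, DisplayMode.BIT_DEPTH_MULTI, 60);

        // A request written with Windows in mind: explicit 32-bit depth.
        DisplayMode request = new DisplayMode(1024, 768, 32, 60);

        // Under the strict (current, underspecified) reading this would
        // fail; under the wildcard reading it succeeds.
        System.out.println(matches(request, linuxMode));
    }
}
```

Note that the second question in the description corresponds to the `candidate.getBitDepth() == DisplayMode.BIT_DEPTH_MULTI` clause: it is what lets a 32-bit request succeed against a MULTI-advertised Linux mode, sparing developers who tested only on Windows.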