Enhancing Texture Support: Color Space and Mapping Mode #557
As it is now, colorspace is controlled using the
We discussed some of this in #361. The main reasons I prefer not to use
So I'm thinking about use-cases that are either scientific or special custom visualizations. But valid cases nonetheless. That said ... it is unlikely that these use-cases involve any of a PBR mesh's textures except for the
The reason why we did not put interpolation and mapping on the texture is that it allows using the same texture in different ways, depending on how you want it to look, which is typically specified by the material. The material typically knows whether exposing a mapping mode makes sense (and which ones). E.g. for an image it would not make sense.
Would these include mapping-mode support for more of the maps, or more mapping modes to choose from?
Is it feasible for users to directly set the "Texture.colorspace" of the corresponding resource to physical space for these scientific-visualization use cases? That way, wgpu will not automatically convert the texture's color space, leaving it entirely up to the shader, which should be specific to the use case for these shader programs. In fact, there is another, less obvious issue here. For PBR rendering, sampling the texture first and then converting to physical space may not be entirely accurate (though in most cases it is not a big deal), because the interpolation is then done in sRGB space during sampling, while we expect the interpolation to be done in physical space.
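The interpolation-order point above can be illustrated numerically. A minimal sketch (the helper names are illustrative, not pygfx or wgpu API): interpolating in sRGB space and then decoding gives a different result than decoding each texel to linear (physical) space first and interpolating there, which is what an `*-srgb` texture format effectively does.

```python
def srgb_to_linear(c):
    """Standard sRGB EOTF (per-channel, c in [0, 1])."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def lerp(a, b, t):
    return a + (b - a) * t

# Two neighboring texel values and a filter weight halfway between them:
a, b, t = 0.2, 0.8, 0.5

# What "sample with linear filtering, then decode in the shader" computes:
interp_then_decode = srgb_to_linear(lerp(a, b, t))

# What an `*-srgb` format gives: decode per texel, then filter in linear space:
decode_then_interp = lerp(srgb_to_linear(a), srgb_to_linear(b), t)

print(interp_then_decode, decode_then_interp)  # noticeably different values
```

The difference is largest for high-contrast texel pairs, which is exactly where filtering artifacts become visible.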
Perhaps this is a viable solution.
Certainly, the first feature I plan to implement is support for "Equirectangular Mapping", which is a common texture mapping mode. Additionally, other user-friendly mapping modes can be added, such as viewport-based mapping. It is important to note that such modifications will greatly facilitate more flexible future expansion.
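For reference, equirectangular mapping boils down to converting a 3D direction vector into the (u, v) coordinate used to sample a 2:1 panorama. A minimal sketch of what such a mapping mode would compute in the shader (the function name and axis convention are illustrative, not pygfx API):

```python
import math

def equirect_uv(direction):
    """Map a unit direction vector to equirectangular texture coordinates."""
    x, y, z = direction
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)          # longitude -> [0, 1]
    v = 0.5 - math.asin(max(-1.0, min(1.0, y))) / math.pi  # latitude  -> [0, 1]
    return u, v

# Looking straight "forward" (-z) samples the center of the panorama:
print(equirect_uv((0.0, 0.0, -1.0)))  # -> (0.5, 0.5)
```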
Indeed, that makes sense. Perhaps we should abstract the concept of a "TextureResource" (which truly corresponds to the internal "GPUTexture" object). The Texture object in pygfx should then not only contain the texture resource, but also describe its sampling behavior, supporting different interpolation modes ("linear", "nearest"), different address modes ("clamp-to-edge", "repeat", "mirror-repeat"), and of course different texture mapping modes. That is, the "Texture" object should directly determine the final result, instead of merely representing a resource. I think this will make it more convenient for users.
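A sketch of the proposed split, with hypothetical class and field names (not pygfx's actual API): a plain resource object for the GPU-side data, and a Texture that pairs it with sampler state and a mapping mode, so the same resource can be sampled in different ways.

```python
from dataclasses import dataclass

@dataclass
class TextureResource:
    data: object                  # the raw texel data (e.g. a numpy array)
    format: str = "rgba8unorm"    # the GPU texture format

@dataclass
class Texture:
    resource: TextureResource
    interpolation: str = "linear"         # "linear" | "nearest"
    address_mode: str = "clamp-to-edge"   # also "repeat", "mirror-repeat"
    mapping: str = "uv"                   # e.g. "uv", "equirect", "cube-reflection"

# The same resource can now be sampled two different ways:
res = TextureResource(data=None)
crisp = Texture(res, interpolation="nearest")
smooth = Texture(res, interpolation="linear", address_mode="repeat")
```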
Yes, it is feasible to do as we support now, where
Yes, good point! I think this is another argument for allowing both approaches. What about
Since PBR materials are less likely to be used in combination with fancy cases where the textures are also used for something else, we could limit support to "physical" and "texture-srgb" for its many maps, except perhaps the main color map. The main reason would be to keep the shader code simpler. But if it turns out we can generalize the code, then there's no reason to exclude "srgb" (and have shader code inserted for the conversion).
So in wgpu we have

The texture maps are properties of the

It looks like you foresee a problem with this approach that I don't (yet?) see. Could you please explain that a bit more?
In this scenario, could we set the texture directly to "physical" instead of "srgb"? When the colorspace property is set to "physical", the wgpu texture format uses the default format; it is set to "xxxx-srgb" only when the colorspace property is "srgb". This means that if we do not want wgpu to perform any texture-space conversion, we just set the colorspace of the texture resource to "physical" and leave the rest entirely to the shader.
Therefore, it seems unnecessary to distinguish the second format separately?
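The format-selection rule described above can be sketched in a few lines (the function name is hypothetical, not pygfx API): only `colorspace="srgb"` opts into wgpu's automatic decoding by switching to the corresponding `*-srgb` format, while "physical" keeps the base format so wgpu performs no conversion and the shader sees raw values.

```python
def resolve_format(base_format: str, colorspace: str) -> str:
    """Pick the wgpu texture format for a given base format and colorspace."""
    if colorspace == "srgb":
        # wgpu decodes sRGB -> linear automatically on sampling
        return base_format + "-srgb"   # e.g. "rgba8unorm" -> "rgba8unorm-srgb"
    # "physical": no conversion by wgpu; the shader controls everything
    return base_format

print(resolve_format("rgba8unorm", "srgb"))      # -> rgba8unorm-srgb
print(resolve_format("rgba8unorm", "physical"))  # -> rgba8unorm
```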
Yes, we need to add functionality in pygfx for mapping wgpu's "sampler" object. What I mean is that the "Texture" object in pygfx should not only correspond to the texture in wgpu, but should also carry the corresponding "sampler" properties. It represents the way a texture is ultimately presented, not just the texture resource itself.
A material may contain multiple different textures, and these textures may have different sampler-related behaviors. I think it's best not to define these behaviors uniformly on the material, but rather to leave them to be managed by the "Texture" objects themselves.
No, because the user most likely wants to interpret the color as srgb.
Same answer :) In summary, I think we want both a way to use an xxx-srgb texture format, as well as a way to do srgb->physical in the shader.
Ok let's summarize what we have so far:
I believe that (1) will result in an API that is unnecessarily complex for most use-cases. Now your data is at
I've been working on improving pygfx's support for glTF, aiming to make the renderer's behavior conform to the glTF 2.0 specification. Currently, I'm looking for a solution that allows each Texture to control its behavior in the shader based on its own colorspace attribute (rather than relying solely on the colormap's colorspace attribute). In this test case (see PR #895), we encountered the issue mentioned in this discussion: "interpolation happens before sRGB decoding", which led to results that didn't match expectations. As mentioned here, modern rendering engines typically don't perform sRGB decoding in the fragment shader. Shaders generally operate in linear space, and texture or color data should undergo space conversion before entering the shader. For
I implemented the
I intend to add more texture mapping support in my free time. However, before doing that, there are two issues that need to be discussed.
The first issue is one that has been mentioned before. Currently, our shaders assume that all textures of the same material are in the same color space (determined solely by the color space of the material's map property, typically sRGB), and they are uniformly converted to physical space within the shader. However, this assumption is not always accurate. In reality, different textures within the same material may have different colorspaces. This is particularly relevant for HDR textures, which are generally in physical space. For example, when a material contains both a color map in sRGB space and an HDR environment map in physical space, we currently cannot handle it.
My suggestion is to let each texture manage and handle its own color space, rather than handling it uniformly within the material shader. It would be convenient to utilize the built-in texture format management strategy of wgpu (xxxx-srgb).
The second issue concerns the mapping mode of textures, which should also be managed by the textures themselves rather than by the material. Currently, only the environment map involves different mapping modes, so we added the "env_mapping_mode" attribute to the material to manage them. However, as we expand our support for various texture functionalities, this approach becomes less desirable.
A better approach would be to add a "mapping" attribute to the Texture object, which determines the "texcoord" used when sampling the texture with the "textureSample" function in the shader (by default it directly uses the UV attributes, but some mapping modes need to be computed, such as the current "CUBE REFLECTION"). In light of this, I believe we need to carefully design shader templates so that texture-mapping-related shader code can be assembled.
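The templating idea can be sketched as a registry where each mapping mode contributes a WGSL snippet that computes the texcoord fed to `textureSample`. The snippet contents and registry structure here are illustrative, not pygfx's actual shader templates:

```python
TEXCOORD_SNIPPETS = {
    # default: pass the mesh UVs straight through
    "uv": "let texcoord = varyings.texcoord;",
    # equirectangular: derive UVs from a world-space direction
    "equirect": (
        "let dir = normalize(varyings.world_dir);\n"
        "let texcoord = vec2<f32>(\n"
        "    0.5 + atan2(dir.x, -dir.z) / 6.283185,\n"
        "    0.5 - asin(dir.y) / 3.141593);"
    ),
    # cube reflection: reflect the view vector for the cube sampler
    "cube-reflection": "let texcoord = reflect(-view_dir, normal);",
}

def texcoord_code(mapping: str) -> str:
    """Return the WGSL snippet that computes texcoord for a mapping mode."""
    try:
        return TEXCOORD_SNIPPETS[mapping]
    except KeyError:
        raise ValueError(f"Unknown mapping mode: {mapping!r}")

print(texcoord_code("uv"))
```

Keeping these snippets keyed by the Texture's `mapping` attribute means new mapping modes only add a registry entry, without touching per-material shader code.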