Enhancing Texture Support: Color Space and Mapping Mode #557


Open
panxinmiao opened this issue Jun 25, 2023 · 7 comments

@panxinmiao (Contributor)

I intend to add more texture mapping support in my free time. However, before doing that, there are two issues that need to be discussed.

The first issue is one that has been mentioned before. Currently, our shaders assume that all textures of the same material are in the same color space (determined solely by the color space of the material's map property, typically sRGB), and they are uniformly converted to physical space within the shader. However, this assumption is not always accurate: different textures within the same material may have different color spaces. This is particularly relevant for HDR textures, which are generally in physical space. For example, when a material contains both a color map in sRGB space and an HDR environment map in physical space, we currently cannot handle it.

My suggestion is to let each texture manage and handle its own color space, rather than handling it uniformly within the material shader. It would be convenient to utilize the built-in texture format management strategy of wgpu (xxxx-srgb).
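A minimal sketch of what per-texture handling could look like (`resolve_wgpu_format` is a hypothetical helper for illustration, not pygfx or wgpu API):

```python
def resolve_wgpu_format(base_format: str, colorspace: str) -> str:
    # Hypothetical helper: choose the wgpu texture format per texture, so that
    # sRGB-encoded data is decoded by the hardware via the built-in "-srgb" formats.
    if colorspace == "srgb" and base_format.endswith("unorm"):
        return base_format + "-srgb"  # e.g. "rgba8unorm" -> "rgba8unorm-srgb"
    return base_format  # linear/physical data is sampled as-is
```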

The second issue concerns the mapping mode of textures, which should also be managed by the textures themselves, rather than by the material. Currently, only the environment map involves different mapping modes, so we added an "env_mapping_mode" attribute to the material to manage them. However, as we expand our support for various texture functionalities, this approach becomes less desirable.

A better approach would be to add a "mapping" attribute to the Texture object, which determines the "texcoord" used when sampling the texture with the "textureSample" function in the shader (by default it directly uses the UV attributes, but some mapping modes need to be calculated, such as the current "CUBE REFLECTION"). In light of this, I believe we need to carefully design the shader templates so that the shader code related to texture mapping can be assembled from them.
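For illustration, the math behind the current "CUBE REFLECTION" mode is the standard reflection of the view direction about the surface normal; a minimal numpy sketch of that calculation (plain Python, not actual shader-template code):

```python
import numpy as np

def cube_reflection_coord(view_dir, normal):
    # r = v - 2 * dot(v, n) * n; the reflected direction is then used
    # as the cube-map sample coordinate instead of the plain UV attribute.
    v = np.asarray(view_dir, dtype=np.float64)
    n = np.asarray(normal, dtype=np.float64)
    return v - 2.0 * np.dot(v, n) * n
```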

@almarklein (Member)

My suggestion is to let each texture manage and handle its own color space, rather than handling it uniformly within the material shader. It would be convenient to utilize the built-in texture format management strategy of wgpu (xxxx-srgb).

As it is now, colorspace is controlled using the Texture.colorspace property, so API-wise things are good 😄 The mesh shader indeed only looks at the map's colorspace. I agree that it makes sense to make the shader honor each texture's colorspace (where the texture data represents color).

It would be convenient to utilize the built-in texture format management strategy of wgpu (xxxx-srgb).

We discussed some of this in #361. The main reasons I prefer not to use xxxx-srgb textures (and do the conversion in the shader) are:

  • It gives control over when to do the conversion. E.g. maybe some interpolation is done first, or some other operations that you want to do in srgb space. One particular example is volume rendering.
  • A user may have some data (e.g. measurements of some kind) and want to visualize it by using the same buffer/texture as color data on another world object.

So I'm thinking about use-cases that are either scientific or special custom visualizations. But valid cases nonetheless.
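For reference, the srgb->physical conversion done in the shader is the standard sRGB EOTF; a minimal numpy sketch of the formula:

```python
import numpy as np

def srgb_to_physical(c):
    # Standard sRGB EOTF: decode sRGB-encoded values to linear ("physical") space.
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
```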

That said ... it is unlikely that these use-cases involve any of a PBR mesh's textures except for the .map. So I would be ok with introducing a Texture.colorspace value that represents "use a wgpu xxxx-srgb texture for this".


A better approach would be to add a "mapping" attribute to the Texture object

The reason why we did not put interpolation and mapping on the texture is that it allows using the same texture in different ways, depending on how you want it to look, which is typically specified by the material. The material typically knows whether exposing a mapping mode makes sense (and which ones). E.g. for an image it would not make sense.

However, as we expand our support for various texture functionalities, this approach becomes less desirable.

Would these include mapping-mode support for more of the maps, or more mapping modes to choose from?

@panxinmiao (Contributor, Author)

So I'm thinking about use-cases that are either scientific or special custom visualizations. But valid cases nonetheless.

Is it feasible for users to directly set the "Texture.colorspace" of the corresponding resource to physical space for these scientific visualization use cases? That way, wgpu will not automatically convert the color space of the texture, leaving it entirely up to the shader to control, which is use-case specific for these shader programs anyway.

In fact, there is another issue here that may not be so obvious. For PBR rendering, sampling the texture first and then converting to physical space is not entirely accurate (although it may not matter much in most cases), because the interpolation happens in sRGB space during sampling, while we expect it to happen in physical space.
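A toy example makes this concrete (redefining the standard sRGB decode here for self-containment): interpolating halfway between a black and a white texel gives noticeably different results depending on the order of operations:

```python
import numpy as np

def srgb_to_physical(c):
    # Standard sRGB EOTF (same formula as sketched earlier in the thread).
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

a, b = 0.0, 1.0  # two neighboring sRGB-encoded texel values

# What happens today: the sampler filters in sRGB space, the shader decodes after.
lerp_then_decode = srgb_to_physical((a + b) / 2)                    # ~0.214

# What PBR expects: decode to physical space first, then filter.
decode_then_lerp = (srgb_to_physical(a) + srgb_to_physical(b)) / 2  # 0.5
```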

So I would be ok with introducing a Texture.colorspace value that represents "use a wgpu xxxx-srgb texture for this".

Perhaps this is a viable solution.

Would these include mapping-mode support for more of the maps, or more mapping modes to choose from?

Certainly. The first feature I plan to implement is support for "equirectangular mapping", which is a common texture mapping mode. Additionally, other user-friendly mapping modes can be added, such as viewport-based mapping. It is worth noting that such a change will make future extensions much more flexible.
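For reference, equirectangular mapping turns a unit direction vector into a (u, v) lookup; a minimal numpy sketch, assuming a Y-up axis convention (conventions differ per engine):

```python
import numpy as np

def equirect_uv(direction):
    # Map a unit direction to equirectangular texture coordinates (Y-up assumed).
    x, y, z = direction
    u = 0.5 + np.arctan2(x, -z) / (2.0 * np.pi)  # longitude -> [0, 1]
    v = 0.5 - np.arcsin(y) / np.pi               # latitude  -> [0, 1]
    return u, v
```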

The reason why we did not put interpolation and mapping on the texture is that it allows using the same texture in different ways

Indeed, that makes sense.

Perhaps we should abstract the concept of a "TextureResource" (which truly corresponds to the internal "GPUTexture" object). The Texture object in pygfx should not only contain the texture resource, but also include the sampling behavior of that resource, supporting different interpolation modes ("linear", "nearest"), different address modes ("clamp-to-edge", "repeat", "mirror-repeat"), and of course different texture mapping modes. That is, the "Texture" object should directly determine the final result, instead of merely representing a resource. I think this will make it more convenient for users.
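A minimal sketch of that split (class and attribute names are illustrative, not actual pygfx API):

```python
class TextureResource:
    """Wraps the raw data, i.e. what corresponds to the internal GPUTexture."""
    def __init__(self, data, dim=2):
        self.data = data
        self.dim = dim

class Texture:
    """Combines a resource with a description of how it is sampled and mapped."""
    def __init__(self, resource, *, interpolation="linear",
                 address_mode="clamp-to-edge", mapping="uv"):
        self.resource = resource            # shared, re-usable GPU data
        self.interpolation = interpolation  # "linear" or "nearest"
        self.address_mode = address_mode    # "clamp-to-edge", "repeat", "mirror-repeat"
        self.mapping = mapping              # "uv", "cube-reflection", ...
```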

@almarklein (Member)

Is it feasible for users to directly set the "Texture.colorspace" of the corresponding resource to physical space for these scientific visualization use cases? That way, wgpu will not automatically convert the color space of the texture, leaving it entirely up to the shader to control, which is use-case specific for these shader programs anyway.

Yes, that is feasible and is what we support now, where Texture.colorspace basically means: if this data is used to represent color, interpret it as xx (linear or srgb). My point is that it's not desirable to use an xxxx-srgb texture when srgb is selected, because that would mangle the data in places where it's not interpreted as color.

In fact, there is another issue here that may not be so obvious. For PBR rendering, sampling the texture first and then converting to physical space is not entirely accurate (although it may not matter much in most cases), because the interpolation happens in sRGB space during sampling, while we expect it to happen in physical space.

Yes, good point!

I think this is another argument for allowing both approaches. What if Texture.colorspace could take 3 values (see the sketch after this list):

  • "physical" -> value is loaded and used as a color as-is.
  • "srgb" -> value is loaded and converted to physical in the shader. (default)
  • "texture-srgb" -> an xxxx-srgb texture is used. Users who know that their data is only used as (srgb) color, can use this to benefit from correct interpolation.

Since PBR materials are unlikely to be used in combination with fancy cases where the textures are also used for something else, we could limit support to "physical" and "texture-srgb" for its many maps, except perhaps the main color map. The main reason would be to keep the shader code simpler. But if it turns out we can generalize the code, then there's no reason to exclude "srgb" (and have shader code inserted for the conversion).

Perhaps we should abstract the concept of a "TextureResource" (which truly corresponds to the internal "GPUTexture" object). The Texture object in pygfx should not only contain the texture resource, but also include the sampling behavior [...]

So in wgpu we have texture objects for the data, and sampler objects for the interpolation and mapping. The question is how to map this onto pygfx, where we have WorldObject with Geometry and Material.

The texture maps are properties of the Material (although for Image and Volume it's on the geometry). The path we've taken so far is to express the sampler-related stuff as properties of the material as well. This seems to make sense, since the Material is for defining how things look.

It looks like you foresee a problem with this approach that I don't (yet?) see. Could you please explain that a bit more?

@panxinmiao (Contributor, Author)

My point is that it's not desirable to use an xxxx-srgb texture when srgb is selected, because that would mangle the data in places where it's not interpreted as color.

In this scenario, could we set the texture directly to "physical" instead of "srgb"? When the colorspace property is set to "physical", the wgpu texture uses the default format, whereas it is set to "xxxx-srgb" only when the colorspace property is set to "srgb". This means that if we do not want wgpu to perform any color-space conversion, we just need to set the colorspace of the texture resource to "physical" and leave the rest entirely to the shader.

  • "physical" -> value is loaded and used as a color as-is.
  • "srgb" -> value is loaded and converted to physical in the shader. (default)
  • "texture-srgb" -> an xxxx-srgb texture is used. Users who know that their data is only used as (srgb) color, can use this to benefit from correct interpolation.

Therefore, it seems unnecessary to distinguish the second value separately?

So in wgpu we have texture objects for the data, and sampler objects for the interpolation and mapping. The question is how to map this onto pygfx,

Yes, we need to add functionality in pygfx that maps to wgpu's "sampler" object. What I mean is that the "Texture" object in pygfx should not only correspond to the texture in wgpu, but should also carry the corresponding "sampler" properties. It represents the way a texture is ultimately presented, not just the texture resource itself.

The path we've taken so far is to express the sampler-related-stuff as properties of the material as well. This seems to make sense, since the Material is for defining how things look.

A material may contain multiple different textures, and these different textures may have different sampler-related behaviors. I think it's best not to define these behaviors uniformly on the material, but rather to leave them to be managed by the "Texture" objects themselves.

@almarklein (Member)

In this scenario, could we set the texture directly to "physical" instead of "srgb"?

No, because the user most likely wants to interpret the color as srgb.

Therefore, it seems unnecessary to distinguish the second value separately?

Same answer :) In summary, I think we want both a way to use an xxx-srgb texture format and a way to do srgb->physical in the shader.


Yes, we need to add functionality in pygfx that maps to wgpu's "sampler" object.

Ok let's summarize what we have so far:

  1. Adding the sampler props to the Texture, which then wraps the actual texture, so that it can be re-used with different sampler properties.
  2. Adding the sampler properties to the material, as happens now.
  3. For every Material.xx_map we add a Material.xx_sampler. This could even be an immutable object (a named tuple perhaps?).

I believe that (1) will result in an API that is unnecessarily complex for most use-cases. Now your data would be at object.material.map.actual_texture?? As you point out, (2) kinda breaks down for materials that have many texture maps; it'd get messy. What about (3)?
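A minimal sketch of what (3) could look like (names are illustrative, not actual pygfx API):

```python
from typing import NamedTuple

class SamplerProps(NamedTuple):
    # Immutable per-map sampler description, paired with each Material.xx_map.
    interpolation: str = "linear"
    address_mode: str = "repeat"
    mapping: str = "uv"

# Usage idea: material.env_map_sampler = SamplerProps(mapping="cube-reflection")
```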

almarklein mentioned this issue Jun 27, 2024
@panxinmiao (Contributor, Author) commented Dec 12, 2024

I've been working on improving pygfx's support for glTF, aiming to make the renderer's behavior conform to the glTF 2.0 specification. Currently, I'm looking for a way to let each Texture control its behavior in the shader based on its own colorspace attribute (rather than relying solely on the color map's colorspace attribute).

In this test case (see PR #895), we encountered the issue mentioned in this discussion, "interpolation happens before sRGB decoding", which led to results that didn't match expectations.

[image: rendered test-case result showing the interpolation artifact]

As mentioned here, modern rendering engines typically don't perform sRGB decoding in the fragment shader. Shaders generally operate in linear space, and texture or color data should undergo color-space conversion before entering the shader.

For unorm-format textures, we can directly use -srgb formatted textures. For other formats, the data is usually assumed to already be in linear space, so no additional processing is typically needed. I reviewed the code of both three.js and babylon.js, and neither of them handles this (i.e., for non-unorm textures, they simply ignore the colorspace attribute and assume the data is in linear space).

If we must support non-unorm textures in sRGB space, perhaps it would be more appropriate to provide a preprocessing method outside of the rendering process to convert the data to linear space before uploading it to the GPU. The WebGPU specification includes a copyExternalImageToTexture method that allows specifying the colorspace of an image and automatically converts it according to the target texture format. However, this method doesn't seem to be implemented in wgpu-native. I believe we should consider implementing a similar method in pygfx or wgpu-py.
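Until such a method exists, a CPU-side preprocessing step would be a possible workaround; a minimal sketch (a hypothetical helper, assuming uint8 or float input):

```python
import numpy as np

def srgb_image_to_linear(img):
    # Decode sRGB-encoded image data to linear space before uploading to the GPU,
    # so the shader can assume the texture is already in physical space.
    c = img.astype(np.float32) / 255.0 if img.dtype == np.uint8 else img.astype(np.float32)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
```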

@almarklein (Member)

I implemented the tex-srgb idea in #906, so that we can do things the glTF-compliant way.
