
Extension for Procedural textures #1889

Open
rsahlin opened this issue Oct 23, 2020 · 34 comments

@rsahlin

rsahlin commented Oct 23, 2020

Investigating interest in an extension for procedural textures - IKEA_procedural_images

The purpose of this extension is to enable procedurally generated images that can be used as texture references.
The proposed solution will offer the possibility of generating textures at load time by defining new texture sources and a way of describing image generation from a set of image parameters.
One goal is to reduce transmission size; another is to offer dynamic output - for instance, altering the exact look of a wood texture at load time.

The initial proof of concept was to verify that the quality of procedurally generated wood textures is good enough - we are convinced that we achieved this.
The goal of this extension is to provide a generic enough descriptive syntax that other types of textures can be created.

The goal is NOT to create a flexible material graph, nor to force implementations to do texture generation in shaders (on the fly).
That shall certainly remain possible; however, it is not the goal of the first iteration.
Think of it as taking a subset of a procedural graph and then locking the operations so that implementations are simpler.

Overview

  • The procedural image specification will define a set of operations to be carried out in order to produce texture source images.
    You may see this as a compiled (or locked) subpart of a material node graph.
  • The produced source images may contain basecolor, normal, roughness and possibly occlusion.
    We estimate that metalness is not needed, since the procedural specification targets a single material - but this is somewhat undecided.
  • The default behavior is that the textures are generated at load-time and sampled as any 'normal' texture source.
    It is up to implementations to dynamically generate the textures, for instance in the fragment shader, however this behavior is not mandated nor is it the goal of this first iteration of the extension.

Way forward

  • Define a list of graphical operators (add, mul, noise, stretch etc.) needed to generate procedural images.
  • Define a list of known materials we believe can be represented using these operators.
  • Define a list of DCC tools / material systems that are or may be compatible with this definition.
  • Supply detailed operator declarations - what the inputs are and what the expected output is (a hypothetical sketch follows this list).
  • Supply some sort of test framework so implementers can validate their implementation of the operators.
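
As a starting point, a single operator declaration might look something like the following - purely a hypothetical sketch, where the operator names, parameter types and JSON layout are placeholders rather than part of any agreed format:

"operators": [
    {
        "name": "noise",
        "inputs": {
            "scale": "float",
            "octaves": "int",
            "seed": "int"
        },
        "output": "image (grayscale)"
    },
    {
        "name": "mul",
        "inputs": {
            "a": "image | float",
            "b": "image | float"
        },
        "output": "image"
    }
]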

1: Texture reference

For a procedurally generated texture to be usable as a source, it must be possible to reference it.
This can be done at the image level, where the procedural file is referenced.
The following adds two procedurally generated images as sources, generating the basecolor and normal when the image is loaded.
A fallback is specified; however, this is probably not wanted.

"textures": [
	{
	    "source": 0,
	},
	{
	    "source": 1,
	}
]


"images": [
    {
        "uri": "fallback_wood_basecolor.png",
        "mimeType": "image/png",
        "extensions": {
            "IKEA_images_procedural_wood",
             "uri": "ikea_wood.ipw",
             "target": "basecolor"
         }
    },
    {
        "uri": "fallback_wood_normal.png"
        "mimeType": "image/png",            
        "extensions": {
            "IKEA_images_procedural_wood",
             "uri": "ikea_wood.ipw",
             "target": "normal"
         }
    }
]

2: Procedural generator

This part will specify some sort of declarative syntax for describing the data provided to the generator.
Input will be one or more datasets; output is the generated image (one for each of the target texture maps).
It shall be (reasonably) easy to implement, provide sufficient performance for real-world load-time usage, and guarantee that the visual output is the same across implementations.
It shall be possible to output different target maps such as color, normal, roughness etc.
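
For illustration, the dataset handed to the generator could be declared along these lines (all field names and values here are hypothetical, not a proposed format):

{
    "targets": ["basecolor", "normal", "roughness"],
    "resolution": [1024, 1024],
    "parameters": {
        "grainScale": 3.5,
        "ringCount": 12,
        "seed": 42
    }
}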

The procedurally generated image is specified in the images array using the extension. It is recommended to require this extension or to provide only a very basic image fallback (otherwise the whole purpose of reducing transmission size is lost).

Source code (C99) and a wasm build will be provided for the generator, reducing the implementation overhead.

One part of this project will be to define the (graphic) operators that are needed to create the procedural textures.
Another important part will be to create a test and verification suite so that implementers know that they are getting the textures right.

@lexaknyazev
Member

While supporting the feature in general, I have to comment on some details.

This could be done by adding one (or more) mime-type

MIME type is a formally defined term (administered by IANA) for identifying specific data formats. glTF's image objects are supposed to represent pre-existing resources (hence the built-in bufferView and uri options).

To accommodate the possibility of other texture sources, we deliberately made the texture.source property optional so that extensions on a texture object could provide alternative ways of supplying texture data. FWIW, that property was supposed to be texture.image but we didn't manage to rename it in time for the glTF 2.0 release.

Assuming that procedural definitions may be reused for different textures, the former should be stored in a new top-level array.

@rsahlin
Author

rsahlin commented Oct 23, 2020

Assuming that procedural definitions may be reused for different textures, the former should be stored in a new top-level array.

Thanks for your insights @lexaknyazev - this is way above my level of expertise!

Could you please explain what 'a new top-level array' means and what this would look like?

@lexaknyazev
Member

"A new top-level array" is an array defined within an extension object that is located in the root-level extensions slot. For a ratified example, see KHR_lights_punctual.

Here's how it may look for procedural textures.

{
  "extensionsUsed": [
    "IKEA_textures_procedural"
  ],
  "asset": {
    "version": "2.0"
  },
  "extensions": {
    "IKEA_textures_procedural": {
      "procedures": [
        {
          "type": "WOOD",
          ...
        }
      ]
    }
  },
  "textures": [
    {
      "extensions": {
        "IKEA_textures_procedural": {
          "procedure": 0
        }
      }
    }
  ]
}

@donmccurdy
Contributor

This part will specify some sort of declarative syntax for understanding the data provided to the generator...

If we go down the path of supporting procedural textures, we should seriously consider existing standards like MaterialX Standard Nodes:

MaterialX Specification

I'm not suggesting that we include all of MaterialX here — just that we use similar standard nodes for procedural textures, effectively supporting the "pattern graph" subset of MaterialX. This provides very flexible inputs to the existing glTF material system (and upcoming extensions) without defining entirely new shading models. MaterialX can be used with Autodesk Standard Surface — a very good match for glTF's direction — but MaterialX also has additional features for defining custom surface shaders, and I think that should be out of our scope for the time being, as I'm not confident it fits the realtime transmission ecosystem well right now.

Less impactfully, I'd also suggest <PREFIX>_texture_<???> as the name of the extension, to align with https://github.com/KhronosGroup/glTF/tree/master/extensions#naming.

@rsahlin
Author

rsahlin commented Oct 23, 2020

Thanks @donmccurdy
I agree - we should try to reuse or be influenced by what is already available in the community.

I guess the first step is to understand what is already available - apart from MaterialX - what are the other (open) ways of procedurally generating textures?

@rsahlin
Author

rsahlin commented Oct 23, 2020

"A new top-level array" is an array defined within an extension object that is located in the root-level extensions slot. For a ratified example, see KHR_lights_punctual.

Ah ok, I think I misunderstood - I thought you were referring to the mime-types.
Yes, sure I see your point.
I'm wondering if the extension should be on the Texture or Image object.

@donmccurdy
Contributor

NVIDIA's MDL is another good example. My understanding is that it interops well with MaterialX, and maybe even uses the same standard nodes (citation needed?). I'm not aware of any other open, declarative representation of a procedural texture. I say "declarative" because you could argue that OSL, GLSL, etc. are also representations of a procedural texture, but they're not compelling to me for this use case, because they're imperative, less composable, and less portable.

@donmccurdy
Contributor

I'm wondering if the extension should be on the Texture or Image object.

I'd recommend putting the extension on the texture object — this fits very well with how we extend for KTX and WebP textures today.
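
For comparison, this is the pattern KHR_texture_basisu uses today - the texture's base source serves as the fallback while the extension supplies the alternative image - and a procedural extension could hook in the same way:

"textures": [
    {
        "source": 0,
        "extensions": {
            "KHR_texture_basisu": {
                "source": 1
            }
        }
    }
]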

@lexaknyazev
Member

I'm wondering if the extension should be on the Texture or Image object.

I'd strongly prefer to keep the semantics of image objects intact (i.e. they are simple resource pointers) so as not to interfere with packaging (think uri/bufferView/base64 conversions) and compression (BasisU, WebP) tools. Keep in mind that image.uri or image.bufferView must always be provided per the spec.

@lexaknyazev
Member

use similar standard nodes for procedural textures, effectively supporting the "pattern graph" subset of MaterialX

The very first step could be even simpler: define and name the most-required texture patterns using something like MaterialX or MDL and just refer to those names from the extension.

@donmccurdy
Contributor

I suspect that defining high-level texture patterns (wood, checkerboard, etc.) would turn out to be more complicated than defining a discrete set of low-level, composable nodes. MaterialX is able to provide relatively simple OSL reference implementations for each of its standard nodes, which is a big plus to me.

Practically, an engine that supports these high-level texture patterns will absolutely need to assemble shaders at loading time, whether through a node-based system or direct shader generation. If that's the case, I think we should go directly to lower-level nodes as MaterialX, MDL, and USD are doing.

@meshula

meshula commented Oct 28, 2020

FWIW, MatX is higher level than MDL; MatX is a general purpose shader graph, and MDL is specifically a physically based model. MatX will ultimately target MDL, similarly to how it targets OSL and GLSL.

@pjoe

pjoe commented Oct 29, 2020

Interesting topic, been thinking about something like this for a while 😄

One thing that could be worth considering is how these procedurals are created by artists - could this e.g. fit into how Blender would do node-based procedural textures? (I think there is already some interest in the Blender community about this - also check out this guy's stuff: https://gumroad.com/simonthommes).

I suppose Substance Designer is also very much in this space.

@rsahlin
Author

rsahlin commented Oct 29, 2020

@donmccurdy For our first step in a prototype we would choose one of the 3D tools that support the creation of wood textures and have the code readily available to reverse that 'magic data' into pixels.
We would then use this to turn the procedural data into wood textures at load time.
At this moment in time we are not aiming to solve the node-based material system.

@rsahlin
Author

rsahlin commented Oct 29, 2020

One thing that could be worth considering is how these procedurals are created by artist - could this e.g. fit into how Blender3D would do node based procedural textures (I think there is some interest already in blender community about this - also check out this guys stuff: https://gumroad.com/simonthommes).

This is exactly what we are aiming for in our prototype @pjoe
We want to condense this down to the absolute minimum (KISS) and start with how the wood texture would be generated in a tool such as Blender.
The next step is then to get hold of the code, or pseudocode, that turns the 'wood node' into actual pixel data.
As a first step we are not looking into creating a node-based material system or even generating the wood textures on the fly in shaders (though implementations would of course be free to do that if they want to).

@pjoe

pjoe commented Oct 30, 2020

Great :)

I do think, like @donmccurdy, that you would quickly get to the point where it makes most sense to build this from simpler 'nodes', e.g. 4D noise (which is needed for making procedural noise tileable). IIRC 4D noise nodes landed in Blender 2.81.

At load time you could 'execute' the node graph to generate 'normal' 2D textures (maybe at different resolutions depending on device) and hand those off to the actual renderer.

If you haven't already looked at Substance Designer, it is a good source of inspiration for this kind of system - even though it is proprietary.

@rsahlin
Author

rsahlin commented Oct 30, 2020

At load time you could 'execute' the node graph to generate 'normal' 2D textures (maybe at different resolutions depending on device) and hand those off to the actual renderer.

Thanks @pjoe - this sounds like what I am after :-)
Do you have any more information regarding how this would be executed from outside Blender for instance?

In order not to choke myself with too much to do, I rely on others' expertise when it comes to choosing the editing tool :-)
From our perspective the most important thing is to pick one tool for the prototype (e.g. Blender) and make a proof of concept.
Picking one that is open and has source code readily available is a must.

Later on we will definitely look into more complex solutions, such as how to define the procedural texture generation.

@pjoe

pjoe commented Oct 30, 2020

I don't think Blender currently has the ability to export these procedural texture nodes in any format (except .blend).

You can try asking the Blender community, might be others looking at this (e.g. https://devtalk.blender.org/, https://blender.chat/)

@donmccurdy
Contributor

If your goal is to support a discrete list of "known" procedural textures (let's say WOOD_FINE_GRAIN, WOOD_ROUGH_GRAIN, TEXTILE, TEXTILE_MESH), and to list any allowed parameters to those patterns, you could create a reasonable workflow with custom nodes for each texture type in Blender. The glTF Blender addon is extensible, and you could define an extension that detects each custom node and just writes the appropriate parameters with this extension. There are probably similar options in Substance Designer, but I don't know it well enough to say.
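
A sketch of what such a parameterized high-level pattern could look like, reusing the top-level procedures array suggested earlier in this thread (the type name and parameters are hypothetical):

"procedures": [
    {
        "type": "WOOD_FINE_GRAIN",
        "parameters": {
            "ringScale": 4.0,
            "grainContrast": 0.6,
            "seed": 7
        }
    }
]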

For a tightly-controlled pipeline (i.e. you can make the needed changes in 1-2 DCC tools and 1-2 viewers) this should be a reasonable approach. For more widespread adoption I think it would be necessary to shift to lower-level nodes like MaterialX or MDL — I don't expect it is possible to get widespread agreement across the ecosystem on high-level abstractions like "wood".

I may open a PR describing a lower-level node-based procedural texture approach at some point, but for the moment it sounds like these are different things, yes. 👍

@donmccurdy
Contributor

I don't think Blender currently has the ability to export these procedural texture nodes in any format (except .blend).

I've been keeping an eye on things like https://blenderartists.org/t/materialx-blender-integration/700331 and godotengine/godot-proposals#714, but at the moment I don't see any signal from Blender or Godot on how they plan to approach this problem. Blender is working on USD integration, but I'm not sure what that means in terms of node-based materials, if anything.

@rsahlin
Author

rsahlin commented Nov 4, 2020

For a tightly-controlled pipeline (i.e. you can make the needed changes in 1-2 DCC tools and 1-2 viewers) this should be a reasonable approach. For more widespread adoption I think it would be necessary to shift to lower-level nodes like MaterialX or MDL — I don't expect it is possible to get widespread agreement across the ecosystem on high-level abstractions like "wood".

I agree, our first goal is to do a proof of concept that we are happy with :-)
Step two would be to expand on that and make a solution that declares the wood data in a way the community could embrace.
I fully understand that our first goal is purely internal - however, I want to get feedback and generate interest for our future approach :-)

@gustavolsson-ikea

gustavolsson-ikea commented Nov 5, 2020

For more widespread adoption I think it would be necessary to shift to lower-level nodes like MaterialX or MDL — I don't expect it is possible to get widespread agreement across the ecosystem on high-level abstractions like "wood".

This is exactly what @rsahlin and I have been discussing. I'm no expert on procedural generation, but I suspect that most procedural representations boil down to a limited number of math operations and noise generation primitives, i.e. a simple math expression and not a Turing-complete domain-specific language (I do think branching will be required though). If the representation is low-level enough, other software packages (Substance/Blender) will be able to export their procedural node graphs to this format!

Here are the sources for Blender's texture nodes for example: https://developer.blender.org/diffusion/B/browse/master/source/blender/nodes/texture/nodes/. Some of these nodes are too high-level (bricks), but the sources could be used to re-create a procedural texture that has been created in Blender and exported to a declarative format. After this proof-of-concept we can define a set of "lowest common denominator" nodes that must be present in an extension.

Another thing we have been talking about is that it would be nice for the generator to be able to use other images/textures in the glTF file as inputs in addition to simple value constants. This way, a noise/mask/grain texture could be embedded into the glTF and then used in the generation step for a procedural wood texture at load time.
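
To illustrate, here is a hypothetical node list in which an embedded glTF image (referenced by index into the images array) is mixed with generated noise - the names and structure are illustrative only, not a proposed format:

"nodes": [
    {
        "op": "image",
        "image": 2,
        "output": "grain"
    },
    {
        "op": "noise",
        "scale": 8.0,
        "output": "n0"
    },
    {
        "op": "mix",
        "a": "grain",
        "b": "n0",
        "factor": 0.5,
        "output": "basecolor"
    }
]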

I think the focus should be on load-time generation; we don't have to go to runtime evaluation to get the benefits of procedural generation. As long as we take the "pixel-centric" approach, per-pixel evaluation at runtime should be possible in the future.

@pjoe

pjoe commented Nov 5, 2020

Right, there are a couple of fundamental building blocks for doing procedural textures, like noise (4D for tiling in uv, 6D if you e.g. also need tiling in time), shape, scatter, etc. My best reference for inspiration is, as mentioned, Substance Designer; see e.g.: https://www.youtube.com/playlist?list=PLB0wXHrWAmCwWfVVurGIQO_tMVWCFhnqE
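
(As an aside on why 4D noise gives seamless uv tiling: each texture coordinate is mapped onto its own circle, so the sampling domain becomes a torus and the texture wraps in both directions by construction:

$$ (x, y, z, w) = (R\cos 2\pi u,\; R\sin 2\pi u,\; R\cos 2\pi v,\; R\sin 2\pi v), \qquad t(u, v) = \mathrm{noise}_4(x, y, z, w) $$

Since (u, v) and (u + 1, v + 1) map to the same 4D point, t tiles with period 1 in both directions; the radius R controls the feature scale.)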

For the Blender source, note that it has a couple of versions of the nodes: one for Cycles and one for Eevee. See e.g. the 4D Perlin noise shader here: https://developer.blender.org/diffusion/B/browse/master/source/blender/gpu/shaders/material/gpu_shader_material_noise.glsl$191

@pjoe

pjoe commented Nov 5, 2020

Btw, I also found my original investigation into tiling noise in Blender: https://blender.stackexchange.com/questions/135437/how-to-make-tileable-procedural-noise-texture

NOTE: 4D noise was added in 2.81 (I think) - not by me :)

@donmccurdy
Contributor

After this proof-of-concept we can define a set of "lowest common denominator" nodes that must be present in an extension.

Just to confirm: you're saying this extension is a proof of concept and won't use "lowest common denominator" nodes, but that you think that could be a next step afterward? The latter is obviously a big project, so we can certainly start a new thread for that if you'd rather keep this issue focused on something simpler.

@rsahlin
Author

rsahlin commented Nov 5, 2020

Just to confirm: you're saying this extension is a proof of concept and won't use "lowest common denominator" nodes, but that you think that could be a next step afterward? The latter is obviously a big project, so we can certainly start a new thread for that if you'd rather keep this issue focused on something simpler.

I would say that this issue serves the purpose of discussing how an extension for procedural textures could be done.
From an Ikea perspective we will first focus on a proof of concept - this will most likely not define the semantics needed to parameterize procedural wood textures - it is simply to be able to move forward and verify parts of the solution, e.g. the quality and size gains that we can expect.
The second phase will then focus on detailing the semantics needed.

As I see it, both phases are relevant for procedural textures - it's just that for the moment our focus is on the first phase.
Does this answer your question @donmccurdy ?

@donmccurdy
Contributor

Ok, I think so. If you feel that the comments here are getting ahead of your immediate goals feel free to let us know and we can split threads as needed. :)

@rsahlin
Author

rsahlin commented Nov 5, 2020

You do have a valid point.
I think we should keep the first phase internal to Ikea and use this thread for discussing the semantics/node structure for procedurally generated textures.
This way we (Ikea) can 'go about' our proof of concept and we can discuss here what we should aim for when defining the semantics. :-)

@fire

fire commented Apr 30, 2021

Suppose I wanted to do a MaterialX material-to-material mapping with a stage 0 approach, where one pretends that the MaterialX definition is a png.

Has anyone done that?

Stage 1 would be like glTF variants for the procedural properties.

@rsahlin
Author

rsahlin commented Apr 30, 2021

@fire Hi and thanks for your comment
I am not sure I understand what you mean by pretending that the MaterialX definition is a png?
Do you mean that the input - the recipe, if you will - to the procedural generation should be a png (and not the list of operations needed to perform the image generation)?

Could you please elaborate?

@fire

fire commented Apr 30, 2021 via email

@rsahlin
Author

rsahlin commented Apr 30, 2021

Thanks for the clarification @fire
Your suggestion does not fall under this extension proposal - this extension is aimed at defining a (2D) procedural image generator.
It will not aim at affecting geometry or other parts of the material - it will just provide texture input for basecolor, normal, roughness and possibly occlusion.

Hope this answers your question?

@fire

fire commented Apr 30, 2021

A (2D) procedural image generator.

It will not aim at affecting geometry or other parts of the material - it will just provide texture input for basecolor, normal, roughness and possibly occlusion.

Yes. All the possible PBR glTF 2.0 material parameters that are supported.

This is exactly what I want.

@rsahlin
Author

rsahlin commented Apr 30, 2021

This is exactly what I want.

Great :-)

We are in the process of starting the second phase of the proof of concept - I will be able to share more in the coming months.
Please be patient as we are still in the analysis phase :-)
