AI Texturing Properties

UModeler X provides a wide range of settings for AI texturing, which are seamlessly integrated with the Stable Diffusion web UI through the API.

The AI texturing properties that can be viewed in UModeler X are primarily the most frequently used settings in the Stable Diffusion web UI. This allows users to quickly change settings and see the results without having to go back to the web UI.

Even if a desired property is not exposed in UModeler X, settings adjusted in the Stable Diffusion web UI are automatically reflected when generating.


Properties

ProjectionOnly

This property generates a texture of the current state of the model without using AI texturing.

This is useful for previewing the camera state or the model's current DepthMap representation with the various Generate buttons before applying AI texturing.

Checkpoint

This setting selects the checkpoint to use for AI texturing.

Address

This is the address to access the Stable Diffusion web UI. The default setting is http://127.0.0.1:7860/, and in most cases you will not need to change this value.
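As a rough sketch of what a call to this address looks like (the `/sdapi/v1/txt2img` endpoint and field names come from the web UI's API mode, enabled with its `--api` flag; the exact payload UModeler X sends is an internal detail and assumed here), a request could be assembled like this:

```python
import json
from urllib.request import Request

BASE_URL = "http://127.0.0.1:7860"  # the default web UI address

def build_txt2img_request(prompt, negative_prompt="", cfg_scale=7.0,
                          sampler_name="Euler a", steps=20, seed=-1):
    """Assemble a JSON request for the web UI's txt2img endpoint."""
    payload = {
        "prompt": prompt,
        "negative_prompt": negative_prompt,
        "cfg_scale": cfg_scale,
        "sampler_name": sampler_name,
        "steps": steps,
        "seed": seed,
    }
    return Request(
        f"{BASE_URL}/sdapi/v1/txt2img",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_txt2img_request("a weathered stone wall", steps=25)
print(req.full_url)  # http://127.0.0.1:7860/sdapi/v1/txt2img
# urllib.request.urlopen(req) would send it while the web UI is running with --api
```

Most of the properties below (Prompt, CFGScale, Sampler, SamplingStep, Seed) map directly onto fields of such a request.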

Prompt

This is a text field that specifies what you want to see when generating the texture.

NegativePrompt

A text field that specifies what you want excluded from the generated texture. In general, it is recommended to fill in this field with a Negative Embedding rather than writing it yourself.

Below is a guide to setting up Negative Embedding.

CFGScale

The CFG Scale (Classifier-Free Guidance Scale) is a property that controls how closely the generated image follows the prompt.

Higher values of CFG Scale generate images that are more faithful to the prompt, but at the expense of image quality.

Conversely, a lower CFG Scale value will result in an image that follows the prompt less closely, but the image quality is likely to be higher.

Suggested Values

We typically recommend a CFGScale between 7 and 9.
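Under the hood, classifier-free guidance blends the model's unconditional and prompt-conditioned predictions; this toy calculation (illustrative numbers only, not UModeler X's internals) shows how a larger scale pushes the result harder toward the prompt:

```python
def apply_cfg(uncond, cond, cfg_scale):
    # Guided prediction: start from the unconditional result and
    # push toward the prompt-conditioned one by the CFG Scale.
    return [u + cfg_scale * (c - u) for u, c in zip(uncond, cond)]

uncond = [0.0, 0.5]   # model output ignoring the prompt
cond = [1.0, 0.75]    # model output following the prompt
print(apply_cfg(uncond, cond, 1.0))  # [1.0, 0.75]: scale 1 is just the conditional output
print(apply_cfg(uncond, cond, 8.0))  # [8.0, 2.5]: a high scale overshoots toward the prompt
```

The overshoot at high scales is why very large CFG values trade image quality for prompt fidelity.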

Sampler

This setting determines the probabilistic method used to generate the texture. Each sampler handles the random parts of the texture differently, making a small difference in the final result.

Simply put, the Sampler is the setting that determines how randomness is handled during texture generation.

Recommended Sampling Method

We mainly recommend Euler a, DPM++ SDE, and DPM2 Karras.

SamplingStep

This setting determines how many sampling steps are performed during texture generation. More sampling steps can yield higher-quality textures, but lengthen generation time.

Recommended Values

Generally, we recommend a SamplingStep value of 20 to 25.

Restore Faces

This property is used when faces or eye parts are generated incorrectly. We recommend enabling this when facial expressions are misrepresented, especially in realistic art styles.

ImageMaxSize

This property sets the size of the generated texture.

BatchCount

This property sets the total number of images generated in succession in a single generation pass.

Use SceneMap (Img2Img)

  • Layer as Inpaint Mask

    • This property applies AI texturing only to areas painted with a Brush on the Paint Layer. When painting, it is recommended to use a color close to the one you want in the result, or one similar to the surrounding colors.
  • Denoising strength

    • This property determines how much the painted area's content is allowed to change. The closer to 0, the less it changes; the closer to 1, the more significantly the content changes.
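A common way img2img implementations apply this value (a convention, not necessarily UModeler X's exact code) is to use it as the fraction of the sampling steps that actually re-run on the source image:

```python
def img2img_steps(sampling_steps, denoising_strength):
    # Strength 0 re-runs no steps (the image is unchanged);
    # strength 1 runs every step (the content is fully regenerated).
    return min(sampling_steps, int(sampling_steps * denoising_strength))

print(img2img_steps(20, 0.0))   # 0
print(img2img_steps(20, 0.5))   # 10
print(img2img_steps(20, 1.0))   # 20
```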

Use Depth Control Net

A setting that enables the Depth model of the ControlNet. The Depth data represents points closer to the viewpoint in white and points farther away in black. UModeler X's AI Texturing converts the depth of the model into an image and applies it to the Depth model based on the current scene view.

This setting can be utilized to generate more accurate textures that take into account the model's three-dimensional shape.

  • ControlnetModel
    • Option to select the Depth model to use. There are multiple versions of the same model, so choose the one you want.
  • Weight
    • A slider that controls the influence of the Depth model. Values closer to 0 have less influence, while higher values have more influence.
  • Mode
    • A property that sets the emphasis between the Prompt and ControlNet.
      • Balanced: The prompt and ControlNet have equal importance.
      • My prompt is more important: The prompt is more important.
      • ControlNet is more important: The impact of the Depth model in the ControlNet is more important.
  • NearDistanceMargin
    • Option to set the margin for the near distance.
  • FarDistanceMargin
    • Option to set the margin for the far distance.
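As an illustration of the depth-to-image conversion described above (the exact margin handling in UModeler X is an assumption here), a depth value can be clamped to a [near, far] range and remapped so that closer points come out brighter:

```python
def depth_to_gray(depth, near, far):
    """Map a depth value to 0-255: near -> white (255), far -> black (0)."""
    # Clamp into the [near, far] range, then invert so closer is brighter.
    d = max(near, min(far, depth))
    t = (d - near) / (far - near)
    return round(255 * (1.0 - t))

print(depth_to_gray(1.0, near=1.0, far=10.0))   # 255 (closest point, white)
print(depth_to_gray(10.0, near=1.0, far=10.0))  # 0 (farthest point, black)
print(depth_to_gray(5.5, near=1.0, far=10.0))   # 128 (midway, mid-gray)
```

The near and far margins effectively shift where this clamping range begins and ends.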

Extra Control Net 1

This setting is enabled when you want to use an additional ControlNet model in addition to the default ControlNet model.

You can apply two or more ControlNet models in parallel or in chains to get an AI texturing result that combines the best of each model.

Hires.fix

A property that enables upscaling so you can generate textures larger than the set size.

Compared to generating at full resolution directly, this feature produces large textures in a relatively short amount of time, but generation can still take quite long.

  • Upscaler
    • This property selects the upscaling method used when upscaling the texture.
    • The following are recommended: Latent, ScuNET GAN, R-ESRGAN_4x, R-ESRGAN_4x+Animated 6B.
  • Upscale by
    • This property sets the scale factor to multiply the set texture size by.
  • Denoising Strength
    • This property determines how much the texture's content is allowed to change during upscaling. The closer to 0, the less it changes; the closer to 1, the more significantly the content changes.
  • Hires steps
    • This property sets how many times the texture upscaling is repeated. Higher values will result in better quality, but may result in longer generation times.
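The Upscale by factor above multiplies the base texture size set by ImageMaxSize; a minimal sketch of the arithmetic:

```python
def hires_size(base_width, base_height, upscale_by):
    # Hires.fix generates at the base size first, then upscales by the factor.
    return int(base_width * upscale_by), int(base_height * upscale_by)

print(hires_size(512, 512, 2.0))   # (1024, 1024)
print(hires_size(768, 512, 1.5))   # (1152, 768)
```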

Seed

A unique number that plays an important role in texture generation.

Using the same prompt, settings, and Seed value will always generate the same image. The default value is -1, which generates a randomized image each time.

  • Shuffle
    • Button to set the Seed value to random. When clicked, the Seed value is automatically set to -1.
  • Last Seed
    • Button to retrieve the Seed value of the most recently generated texture.
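The reproducibility rule above — same prompt, settings, and Seed always give the same image — can be illustrated with any seeded random generator; this sketch (illustrative only, not UModeler X's internals) also shows the -1 convention:

```python
import random

def pick_seed(seed):
    # Web UI convention: -1 means choose a fresh random seed for this run.
    return random.randrange(2**32) if seed == -1 else seed

def fake_noise(seed, n=4):
    rng = random.Random(seed)  # same seed -> identical noise -> identical image
    return [rng.random() for _ in range(n)]

print(fake_noise(1234) == fake_noise(1234))  # True: a fixed seed reproduces the result
print(pick_seed(7))                          # 7: explicit seeds pass through unchanged
```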

Generate Forever

When enabled, this property keeps generating textures until it is unchecked.

Generate

Button to proceed with texture generation.

Generate(Rect)

This button starts texture generation for a rectangular area created by clicking and dragging in the scene.

Custom Cameras

This property is used to generate textures based on pre-placed cameras. Select the desired camera by clicking the slot next to the property, and that camera will be used as the basis.

  • R: Button to reset all set cameras.
  • +: Button to add a camera slot.
  • -: Button to delete a camera slot. The lowest camera will be deleted first.
  • Generate (Custom Camera)
    • This button generates textures in succession based on the number of cameras set up.
    • For example, if you have two cameras set up, the texture will be generated twice in a row.
  • Generate (All in One)
    • Unlike the Generate (Custom Camera) button, this button generates one texture at a time, taking into account the orientation of all set cameras.
    • If the cameras are positioned so that the image areas of the model overlap, the generated textures may also overlap, so be careful.

Result

An at-a-glance view of the generated textures and the various features associated with them.

  • Rescan
    • A button that brings up all of the textures generated for the currently active model.
  • Rescan All
    • This button loads all textures created within the current project in bulk.
  • Apply Texture
    • This button applies the selected texture to the model and displays it on the screen.
  • Restore Saved Camera Transform
    • This button moves and rotates the scene to the camera view when the selected resulting texture was generated.

PromptHistory

Allows you to view the history of prompts used. You can restore the contents of the Prompt and NegativePrompt by clicking the Apply button next to the desired prompt history entry.