Diffusers documentation

Activation functions


Customized activation functions for supporting various models in 🤗 Diffusers.

GELU

class diffusers.models.activations.GELU

( dim_in: int, dim_out: int, approximate: str = 'none', bias: bool = True )

Parameters

  • dim_in (int) — The number of channels in the input.
  • dim_out (int) — The number of channels in the output.
  • approximate (str, optional, defaults to "none") — If "tanh", use tanh approximation.
  • bias (bool, defaults to True) — Whether to use a bias in the linear layer.

GELU activation function, with optional tanh approximation enabled by passing approximate="tanh".

GEGLU

class diffusers.models.activations.GEGLU

( dim_in: int, dim_out: int, bias: bool = True )

Parameters

  • dim_in (int) — The number of channels in the input.
  • dim_out (int) — The number of channels in the output.
  • bias (bool, defaults to True) — Whether to use a bias in the linear layer.

A variant of the gated linear unit activation function.
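GEGLU-style gating projects the input to twice the output width, then multiplies one half by the GELU of the other. A minimal functional sketch of that mechanism (plain torch, not the library source):

```python
import torch
import torch.nn.functional as F

def geglu(x, weight, bias=None):
    # Project to 2 * dim_out, then split into a value half and a gate half.
    hidden = F.linear(x, weight, bias)
    value, gate = hidden.chunk(2, dim=-1)
    # The gate half passes through GELU and modulates the value half.
    return value * F.gelu(gate)

dim_in, dim_out = 8, 16
w = torch.randn(2 * dim_out, dim_in)  # weight for the single fused projection
x = torch.randn(4, dim_in)
out = geglu(x, w)
print(out.shape)  # torch.Size([4, 16])
```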

ApproximateGELU

class diffusers.models.activations.ApproximateGELU

( dim_in: int, dim_out: int, bias: bool = True )

Parameters

  • dim_in (int) — The number of channels in the input.
  • dim_out (int) — The number of channels in the output.
  • bias (bool, defaults to True) — Whether to use a bias in the linear layer.

The approximate form of the Gaussian Error Linear Unit (GELU). For more details, see section 2 of the GELU paper (Hendrycks & Gimpel, 2016, arXiv:1606.08415).
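The sigmoid approximation in question is x · σ(1.702x). A minimal sketch (plain torch, not the Diffusers class, which also applies a linear projection first) comparing it against exact GELU:

```python
import torch
import torch.nn.functional as F

def approx_gelu(x):
    # Sigmoid approximation of GELU: x * sigmoid(1.702 * x)
    return x * torch.sigmoid(1.702 * x)

x = torch.linspace(-4.0, 4.0, steps=101)
max_err = torch.max(torch.abs(F.gelu(x) - approx_gelu(x)))
print(float(max_err))  # small; the two curves nearly coincide
```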
