A render target is a buffer the video card draws into when a scene is rendered off-screen. Render targets are used in effects such as postprocessing, where the rendered image is processed further before being displayed on the screen.
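A minimal render-to-texture sketch, assuming a browser environment with a WebGL-capable canvas (the scene contents are left empty for brevity):

```javascript
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, 1, 0.1, 100);

// Off-screen buffer; its .texture property holds the rendered pixels.
const renderTarget = new THREE.WebGLRenderTarget(512, 512);

renderer.setRenderTarget(renderTarget); // draw into the target
renderer.render(scene, camera);
renderer.setRenderTarget(null);         // restore rendering to the canvas

// Use the off-screen result as input for further processing,
// e.g. as a texture on another material.
const material = new THREE.MeshBasicMaterial({ map: renderTarget.texture });
```

Setting the render target back to null is important; otherwise subsequent draw calls keep going into the off-screen buffer instead of the canvas.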
Creates a new WebGLRenderTarget.
For the anisotropy option of the target's texture, see Texture.anisotropy. Default is 1.
isWebGLRenderTarget: Read-only flag to check whether a given object is of type WebGLRenderTarget.
width: The width of the render target.
height: The height of the render target.
scissor: A rectangular area inside the render target's viewport. Fragments outside this area are discarded.
scissorTest: Indicates whether the scissor test is active or not.
viewport: The viewport of this render target.
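The scissor and viewport properties can be combined to restrict drawing to a sub-rectangle of the target. A short sketch (the coordinates are illustrative, not required values):

```javascript
import * as THREE from 'three';

const target = new THREE.WebGLRenderTarget(512, 512);

// viewport and scissor are Vector4s: (x, y, width, height).
target.viewport.set(0, 0, 512, 512);    // draw across the full target
target.scissor.set(128, 128, 256, 256); // keep only this rectangle
target.scissorTest = true;              // discard fragments outside it
```

With scissorTest left at false, the scissor rectangle has no effect and the whole viewport is written.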
texture: This texture instance holds the rendered pixels. Use it as input for further processing.
depthBuffer: Renders to the depth buffer. Default is true.
stencilBuffer: Renders to the stencil buffer. Default is false.
depthTexture: If set, the scene depth is rendered to this texture. Default is null.
samples: Defines the count of MSAA samples. Can only be used with WebGL 2. Default is 0.
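The buffer-related options above are typically passed to the constructor. A sketch requesting a multisampled target, assuming a WebGL 2 context (the sample count of 4 is an arbitrary choice):

```javascript
import * as THREE from 'three';

const msaaTarget = new THREE.WebGLRenderTarget(1024, 1024, {
  samples: 4,          // enable 4x MSAA (WebGL 2 only; 0 disables it)
  depthBuffer: true,   // matches the default
  stencilBuffer: false // matches the default
});
```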
setSize(width, height): Sets the size of the render target.
clone(): Creates a copy of this render target.
copy(source): Adopts the settings of the given render target.
dispose(): Frees the GPU-related resources allocated by this instance. Call this method whenever the instance is no longer used in your app.
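A small lifecycle sketch tying the methods above together; the onResize helper is a hypothetical name, not part of the library:

```javascript
import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
const target = new THREE.WebGLRenderTarget(512, 512);

// Keep the target in sync with the renderer's drawing-buffer size.
function onResize() {
  const size = renderer.getSize(new THREE.Vector2());
  target.setSize(size.x, size.y); // reallocates the underlying buffer
}

// When the target is no longer needed, release its GPU resources.
target.dispose();
```

Resizing reallocates the underlying GPU storage, so any previously rendered contents of the target are lost after a setSize call.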
EventDispatcher methods are available on this class.
For more info on how to obtain the source code of this module see this page.