This should work: since it can take a webcam feed, it must also be able to pull in a stream from a video URL.
Anyone have any tips for taking this Shadertoy code and implementing it in Verge3D?
I don’t just want to use the shader as a material. I kinda need what’s going on in the shader to be going on in my Blender/Verge3D scene.
The example provided may not be a video, but the shaders are emitting light into the scene, lighting up the character and the warehouse walls. I want to do the same, but with a video instead of a shader.
The video I’m using as a texture is pulled from the web; it’s not costly at all as far as I’m aware. A “press play” image is clicked and then replaced with a video streamed from online.
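For reference, the click-to-play swap described above can be sketched roughly like this. This is a minimal sketch, not a working Verge3D snippet: it assumes a three.js-style API (Verge3D’s `v3d` namespace largely mirrors three.js), and the names `screenMesh`, the video URL, and the `startVideo` helper are all hypothetical placeholders.

```javascript
// Hypothetical sketch: stream a web video onto a mesh as a texture.
// Assumes Verge3D's three.js-style v3d namespace; URL and mesh names are made up.
const video = document.createElement('video');
video.src = 'https://example.com/stream.mp4'; // hypothetical stream URL
video.crossOrigin = 'anonymous'; // required to use a cross-origin video as a texture
video.loop = true;
video.muted = true; // browser autoplay policies generally require muted playback

const videoTexture = new v3d.VideoTexture(video);

// Called when the "press play" image is clicked (e.g. via a raycast hit
// or a Puzzles "when clicked" event): start playback and swap the maps.
function startVideo(screenMesh) {
    video.play();
    screenMesh.material.map = videoTexture;         // replace the still image
    screenMesh.material.emissiveMap = videoTexture; // make the screen glow (bloom only)
    screenMesh.material.needsUpdate = true;
}
```

Note that, as the rest of this post says, the `emissiveMap` line only makes the screen itself bright; it doesn’t cast light onto surrounding geometry.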
Will light probes work with video? Don’t they have to be baked, so they’d only work with a still image? They wouldn’t work with a video pulled from the web, nor with animated image textures in the uploaded file, because the light probe can’t be updated for each frame, unless I’m mistaken?
Emissive only reacts with bloom, so it isn’t going to cast any kind of light onto other objects in the scene.