Making 3D web apps with Blender and Three.js

From Verge3D Wiki
Revision as of 08:00, 24 November 2021 by Alexander (talk | contribs) (Created page with "Blender is the most popular open-source 3D modelling suite while Three.js is the most popular WebGL toolkit. Both have millions of users, however rarely used together to make...")

Blender is the most popular open-source 3D modelling suite, while Three.js is the most popular WebGL toolkit. Both have millions of users, yet they are rarely used together to build something truly amazing: interactive 3D visualizations that work on the Web!

So, let's discuss how to make fancy 3D web interactives based on your Blender scenes. In general, you have two options:

  1. Use the glTF exporter that comes with Blender and then create a new Three.js application from scratch.
  2. Use an upgraded version of Three.js called Verge3D to achieve the same result without coding.

Approach 1: glTF + Coding

Let's say you already have Blender installed. If not, please refer to the Blender Download page.

With Three.js, it's not so easy! The official Three.js Installation guide recommends using the NPM package manager as well as a JS module bundler called webpack. You will also need to run a local web server to test your apps. And last but not least, you will have to use a command-line interpreter to operate these three tools.

This set of tools looks familiar to seasoned web developers; however, using them to run Three.js can be quite non-trivial, especially for people with limited coding skills (e.g. Blender artists who have some experience in Python scripting).

Another approach is to simply include a static Three.js build in your project directory, add a few external dependencies (such as the glTF 2.0 loader), write a basic HTML page to launch all this, and then run a Python-based web server to host the app. The good thing is that Python is already included in Blender!

To simplify things even more, download and unpack the Three.js-Blender template we've already prepared for you.

So, let's go through the steps needed to produce and run your first Three.js app.

Creating Blender scene

This part is pretty straightforward, provided you know the limitations of the glTF 2.0 format and of 3D web graphics in general. Among them:

  • Make your models low-poly so that the complexity of the entire scene does not exceed several hundred thousand polygons. 100K-500K polys is a good budget.
  • Cameras are supported; however, you should explicitly assign them in your app via JavaScript.
  • Lights and World shader nodes are not supported. We're going to set up lighting via JavaScript instead.
  • Your materials should be based on a single Principled BSDF node. Refer to the Blender Manual for more info.

Suzanne blender.jpg

  • Only a limited set of textures is supported: Color, Metallic, Roughness, AO, Normal Map, Emissive.
  • In general, only two UV maps are supported. The Metallic and Roughness textures should use the same UV map.
  • You can assign offset, rotation, and scale to your textures via a single Mapping node. All other UV modifications are not supported.
  • You can animate a limited set of parameters: object position, rotation, and scale, influence values for shape keys, and bone transformations. Animations should be assigned and controlled via JavaScript.
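The last restriction mirrors the glTF 2.0 specification itself: an animation channel may only target the translation, rotation, scale, or weights (shape keys) path of a node. Since files exported in the .gltf flavor are plain JSON, you can quickly check what an export actually animates. Here is a sketch in Python; the tiny asset is inlined purely for illustration:

```python
import json

# A minimal, hypothetical .gltf fragment: one node whose rotation is animated.
gltf_text = """
{
  "asset": {"version": "2.0"},
  "nodes": [{"name": "Suzanne"}],
  "animations": [{
    "name": "SuzanneAction",
    "channels": [{"sampler": 0, "target": {"node": 0, "path": "rotation"}}],
    "samplers": [{"input": 0, "output": 1, "interpolation": "LINEAR"}]
  }]
}
"""

gltf = json.loads(gltf_text)

# List every animated (animation, node, path) triple; glTF 2.0 allows only
# "translation", "rotation", "scale" and "weights" as target paths.
for anim in gltf.get("animations", []):
    for channel in anim["channels"]:
        node = gltf["nodes"][channel["target"]["node"]]["name"]
        print(anim["name"], node, channel["target"]["path"])
```

Running the same loop over a real export is a handy way to confirm that Blender actually baked the actions you expect.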

Exporting scene to glTF 2.0

First, open Blender preferences, click Add-ons, then make sure that Import-Export: glTF 2.0 format addon is enabled:

Blender stock gltf plugin.jpg

After that, you can export your scene to glTF via the File -> Export -> glTF 2.0 (.glb/.gltf) menu.

Copying Three.js builds and external dependencies

You can download a pre-built version of Three.js on GitHub. Select the latest release, then scroll down to download the *.zip (Windows) or *.tar.gz (macOS, Linux) archive. In our first app we're going to use the following files from that archive:

  • three.module.js — the base Three.js module.
  • GLTFLoader.js — a loader for our glTF files.
  • OrbitControls.js — lets us rotate the camera with a mouse or touchscreen.
  • RGBELoader.js — loads the fancy HDRI map used for environment lighting.
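Once these files are copied, a minimal project directory might look like this (the file names follow this guide; environment.hdr and suzanne.gltf stand for whatever assets you export yourself):

```
my-threejs-app/
├── index.html
├── three.module.js
├── GLTFLoader.js
├── OrbitControls.js
├── RGBELoader.js
├── environment.hdr
└── suzanne.gltf
```

Keeping everything flat in one directory lets the HTML page below import the scripts with simple './' paths.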

Creating main HTML file with your app

Create a file named index.html with the following content. For simplicity we will include the HTML, CSS, and JavaScript code in that single file:

<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Blender-to-Three.js App Template</title>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
    <style>
      body {
        margin: 0px;
      }
    </style>
  </head>
  <body>
    <script type="module">

      import * as THREE from './three.module.js';

      import { OrbitControls } from './OrbitControls.js';
      import { GLTFLoader } from './GLTFLoader.js';
      import { RGBELoader } from './RGBELoader.js';

      let camera, scene, renderer;

      init();

      function init() {

        const container = document.createElement( 'div' );
        document.body.appendChild( container );

        camera = new THREE.PerspectiveCamera( 45, window.innerWidth / window.innerHeight, 0.25, 20 );
        camera.position.set( - 1.8, 0.6, 2.7 );

        scene = new THREE.Scene();

        // environment lighting
        new RGBELoader()
          .load( 'environment.hdr', function ( texture ) {

            texture.mapping = THREE.EquirectangularReflectionMapping;

            scene.background = texture;
            scene.environment = texture;

            render();

            // model
            const loader = new GLTFLoader();
            loader.load( 'suzanne.gltf', function ( gltf ) {

              scene.add( gltf.scene );

              render();

            } );

          } );

        renderer = new THREE.WebGLRenderer( { antialias: true } );
        renderer.setPixelRatio( window.devicePixelRatio );
        renderer.setSize( window.innerWidth, window.innerHeight );
        renderer.toneMapping = THREE.ACESFilmicToneMapping;
        renderer.toneMappingExposure = 1;
        renderer.outputEncoding = THREE.sRGBEncoding;
        container.appendChild( renderer.domElement );

        const controls = new OrbitControls( camera, renderer.domElement );
        controls.addEventListener( 'change', render ); // use if there is no animation loop
        controls.minDistance = 2;
        controls.maxDistance = 10;
        controls.target.set( 0, 0, - 0.2 );
        controls.update();

        window.addEventListener( 'resize', onWindowResize );

      }

      function onWindowResize() {

        camera.aspect = window.innerWidth / window.innerHeight;
        camera.updateProjectionMatrix();

        renderer.setSize( window.innerWidth, window.innerHeight );

        render();

      }

      function render() {

        renderer.render( scene, camera );

      }

    </script>
  </body>
</html>
If you don't want to write any code, just take this file from the template we mentioned earlier.

What do we do in this HTML snippet?

  1. Initialize Three.js: prepare the canvas, the scene and camera, as well as the WebGL renderer.
  2. Create fancy "orbit" camera controls by using the external OrbitControls.js script.
  3. Load an HDR map to achieve decent-quality scene lighting.
  4. Finally, load the glTF 2.0 model we exported previously.

If you open this HTML file from your file manager you'll get... nothing! Without a properly configured web server you won't be able to launch this web application. Since this is a security measure imposed by browsers, there is no easy way to overcome the restriction. So let's write a server!

Running web server

Let's create a basic HTTP server script (or just copy the ready-to-use file from the template) with the following content:

import http.server
import os
import bpy

from threading import Thread
from functools import partial

def ServeDirectoryWithHTTP(directory='.'):
    hostname = 'localhost'
    port = 8000
    directory = os.path.abspath(directory)
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    httpd = http.server.HTTPServer((hostname, port), handler, False)
    httpd.allow_reuse_address = True

    # bind_and_activate was disabled in the constructor above, do it manually
    httpd.server_bind()
    httpd.server_activate()

    def serve_forever(httpd):
        with httpd:
            httpd.serve_forever()

    # a daemon thread dies together with the main (Blender) process
    thread = Thread(target=serve_forever, args=(httpd,), daemon=True)
    thread.start()

    return httpd

# serve the directory where this script (and index.html) is located
app_root = os.path.dirname(bpy.context.space_data.text.filepath)
ServeDirectoryWithHTTP(app_root)

Save it under some name (e.g. server.py), then go to Blender and switch to the Text Editor area:

Blender open text block.jpg

Click Open, find and open your server script, then press the right-arrow (Run Script) icon to launch the HTTP server:

Blender run script.jpg

The server should run in the background until you close Blender.
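The serve-from-a-thread pattern used above can be tried outside Blender as well. Below is a self-contained sketch (no bpy; it serves a throwaway directory on an OS-assigned port instead of the script's fixed port 8000, and fetches a file back to prove the server works):

```python
import http.server
import os
import tempfile
import urllib.request
from functools import partial
from threading import Thread

# Serve a throwaway directory containing a tiny index.html.
directory = tempfile.mkdtemp()
with open(os.path.join(directory, 'index.html'), 'w') as f:
    f.write('<h1>It works!</h1>')

handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
# Port 0 asks the OS for any free port; the Blender script uses 8000.
httpd = http.server.ThreadingHTTPServer(('localhost', 0), handler)

# Daemon thread: the server stops when the main process exits.
thread = Thread(target=httpd.serve_forever, daemon=True)
thread.start()

url = 'http://localhost:%d/index.html' % httpd.server_address[1]
body = urllib.request.urlopen(url).read().decode()
print(body)  # → <h1>It works!</h1>

httpd.shutdown()
```

The same standard-library machinery powers the Blender-embedded version; bpy is only needed there to locate the script's own directory.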

Running the app

Open http://localhost:8000/ in your browser. You should see the following page:

Three app.jpg

Try rotating or zooming the model. There are some issues with its look: basically, it does not appear exactly as it does in the Blender viewport. There are ways to improve the situation, but don't expect much: Three.js is still Three.js and Blender is still Blender.

Organizing pipeline

Here is a typical approach to creating apps with Three.js and Blender:

  1. Use a template to deploy your apps quickly. Feel free to take our template; it's free and open-source.
  2. It's much easier to load and apply a single HDR-based environment light than to tweak multiple light sources with JavaScript.
  3. Blender first, Three.js second. Basically, try to design as much as possible in Blender to reduce the amount of complex graphics coding.
  4. Take some time to learn glTF 2.0, including its limitations and constraints.

Approach 2: Verge3D

Verge3D is a kind of Three.js on steroids. It includes various convenient tools that simplify creating interactive 3D content from Blender scenes. The idea behind it is quite simple: 3D web apps should be designed by artists, not coders.

The tool costs some money; however, you can use the full-featured trial version as long as you wish. Installation is simple: download the installer (Windows) or zip archive (macOS, Linux), then follow the installation procedure and enable the Blender add-on that ships with Verge3D:

Blender verge3d plugin.jpg

Verge3D comes with a handy feature called Sneak Peek:

Blender sneak peek.jpg

By clicking on this "magic" button you can export and preview your Blender scene right in the web browser:

V3d app.jpg

Basically, you won't need to create any projects, write any JavaScript logic, or launch a web server. It's all done by Verge3D.

If you compare the rendering you get in the browser with the Blender viewport, you won't see much difference (switch to the real-time Eevee renderer for even closer results). Verge3D tries to reproduce many Blender features, including native node-based materials, lighting, shadows, animation, morphing, etc. As with vanilla Three.js, it exports to a glTF 2.0 asset and loads it in the browser via JavaScript... except you don't need to write any code yourself.

So how do you make your app interactive, if not with JavaScript? For that, Verge3D comes with a visual scripting environment called Puzzles. For example, to convert the default Blender cube into a nice spinning Utah teapot, you can write the following "code":

Puzzles example.jpg

This logic is quite self-explanatory. Behind the scenes, these blocks are converted to JavaScript, which in turn calls Verge3D APIs. If you need to, you can write your own JavaScript logic or create your own visual blocks. Also, Verge3D is compatible with Three.js, so you can use Three.js snippets, examples, or apps found on the web.

Verge3D comes with many useful features, including App Manager, AR/VR, WordPress integration, Cordova/Electron builders, physics engine, tons of plugins, materials, and asset packs, as well as ready-to-use solutions for e-commerce and e-learning industries. Discussing all these is beyond the scope of this article, so refer to the beginner-level tutorial series to learn more about this toolkit.

Which approach is better?

Three.js has one quite convincing feature: it's free. Basically, if Blender is free, Three.js is free, and your time is free, you are all set! You can make something really big, share it, get some recognition, or even sell your work without paying a buck!

On the other hand, if paying some money for a license is not a big deal, you'd better stick with Verge3D. Not only because it's more powerful yet compatible with Three.js: Verge3D is designed to be an artist-friendly tool, so it will be very helpful if you already use, or are going to use, Blender in your development pipeline.