Making 3D web apps with Blender and Three.js

From Verge3D Wiki
[[Category:Tutorials]][[Category:Blender]]{{#seo:
|description=This tutorial discusses how to use Blender and Three.js to create interactive 3D web applications.
|keywords=Three.js, Blender, 3D, WebGL, 3DWeb, Web3D
|image=blender_plus_threejs.png
}}
[[File:blender_plus_threejs.png|right|thumb]]
It is well known that Blender is the most popular open-source 3D modeling suite. On the other hand, Three.js is the most popular WebGL library. Each has millions of users on its own, yet the two are rarely used together to make interactive 3D visualizations that work on the Web. This is because such projects require very different skill sets to cooperate.


Here we discuss ways to overcome these difficulties. So, how can we make fancy 3D web interactives based on Blender scenes? Basically, you have two options:


# Use the glTF exporter that comes with Blender, then code a Three.js application to load the exported scene.
# Use a framework that provides this integration without coding, such as the upgraded version of Three.js called Verge3D.


This tutorial mostly covers the vanilla Three.js approach. At the end, some info about Verge3D is provided as well.


== Approach #1: Vanilla Three.js ==


=== Intro ===


We assume you already have Blender installed. If not, you can get it from [https://www.blender.org/download/ here] and just run the installer.


With Three.js, however, it's not that easy! The official Three.js [https://threejs.org/docs/#manual/en/introduction/Installation installation guide] recommends using the NPM package manager, as well as the JS module bundler ''webpack''. You'll also need to run a local web server to test your apps, and you'll have to use the command-line interface to operate all these tools. This tool chain looks familiar to seasoned web developers. Still, using it with Three.js can turn out to be quite non-trivial, especially for people with little or no coding skills (e.g. Blender artists who might have some experience in Python scripting at most).
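For reference, the NPM route mentioned above boils down to a project config along these lines (a hypothetical minimal '''package.json''' for illustration only; the <code>three</code>, <code>webpack</code> and <code>webpack-cli</code> package names are real, the version numbers and script are placeholders):

<syntaxhighlight lang="json">
{
  "name": "my-threejs-app",
  "private": true,
  "scripts": {
    "build": "webpack ./src/index.js --mode production"
  },
  "dependencies": {
    "three": "^0.144.0"
  },
  "devDependencies": {
    "webpack": "^5.74.0",
    "webpack-cli": "^4.10.0"
  }
}
</syntaxhighlight>

We won't pursue this route further in this tutorial.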


An easier way is to simply copy the static Three.js build into your project folder, add some external dependencies (such as the glTF 2.0 loader), compose a basic HTML page to link all these together, and finally run a web server to open the app. Even better: Blender ships with Python, so you can just launch the standard web server that comes with it.


This latter approach is exactly what we discuss in this tutorial. If you don't want to reproduce it step by step yourself, you can download the complete [https://github.com/Soft8Soft/threejs-blender-template Three.js-Blender starter project]. Ok, let's begin!


=== Creating the Blender scene ===


This step seems pretty straightforward for a Blender artist, right? Well, you still have to get familiar with the limitations of the glTF 2.0 format, and of WebGL in general! In particular:


* Models must be low-poly to mid-poly so that the entire scene does not exceed several hundred thousand polygons. Usually 100k-500k polys is fine.


* Cameras should be assigned explicitly in your app via JavaScript.


* Lights and world shader nodes are not supported. Again, you'll need to use JavaScript to set up proper lighting.


* Materials should be based on a single '''Principled BSDF''' node. Refer to the [https://docs.blender.org/manual/en/2.80/addons/io_scene_gltf2.html#materials Blender Manual] for more info.


[[File:suzanne_blender.jpg|1000px]]


* Only a limited set of textures is supported, namely Color, Metallic, Roughness, AO, Normal Map, and Emissive.


* At most two UV maps are supported. The Metallic and Roughness textures should use the same UV map.


* You can adjust the offset, rotation and scale for your textures via a [https://docs.blender.org/manual/en/2.80/addons/io_scene_gltf2.html#uv-mapping single Mapping node]. All other UV modifications are not supported.


* You can only animate a limited set of parameters: object position, rotation, scale, the influence value for shape keys, and bone transformations. Again, all animations are initialized and controlled with JavaScript.


* And so on. Other things come up on a need-to-know basis.
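Incidentally, a ''.gltf'' file is plain JSON, so you can sanity-check an exported scene against these limitations with a few lines of Python. Here is a small sketch using only the standard library; the inline <code>gltf</code> dict stands in for <code>json.load(open('suzanne.gltf'))</code> on a real export:

<syntaxhighlight lang="python">
import json

def gltf_stats(gltf):
    """Summarize a parsed glTF 2.0 document. Since lights don't survive
    export and cameras need JavaScript anyway, what matters most is the
    number of meshes, materials, and animations."""
    return {
        'meshes': len(gltf.get('meshes', [])),
        'materials': len(gltf.get('materials', [])),
        'animations': len(gltf.get('animations', [])),
        'cameras': len(gltf.get('cameras', [])),
    }

# A stand-in for json.load(open('suzanne.gltf')); a real exported file
# would also contain buffers, accessors, nodes, etc.
gltf = json.loads('''{
  "asset": {"version": "2.0"},
  "meshes": [{"name": "Suzanne"}],
  "materials": [{"name": "Material", "pbrMetallicRoughness": {}}]
}''')

print(gltf_stats(gltf))  # {'meshes': 1, 'materials': 1, 'animations': 0, 'cameras': 0}
</syntaxhighlight>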


=== Exporting the scene to glTF 2.0 ===

The procedure is as follows. First, open Blender preferences, click '''Add-ons''', then make sure the '''Import-Export: glTF 2.0 format''' add-on is enabled:


[[File:blender_stock_gltf_plugin.jpg|708px]]


After that you can export your scene to glTF via the '''File -> Export -> glTF 2.0 (.glb/.gltf)''' menu. That's it.


=== Adding Three.js builds and dependencies ===


You can download a pre-built version of Three.js from [https://github.com/mrdoob/three.js/ GitHub]. Select the latest release, then scroll down to download the '''*.zip''' (Windows) or '''*.tar.gz''' (macOS, Linux) archive. In our first app we're going to use the following files from it:


* three.module.js — base Three.js module
* GLTFLoader.js — loader for our glTF files.
* OrbitControls.js — to rotate our camera with mouse/touchscreen.
* RGBELoader.js — to load fancy HDRI map used as environment lighting.


=== Creating the main HTML file ===


Create the file '''index.html''' with the following content. For simplicity, it includes everything (HTML, CSS, and JavaScript):


<syntaxhighlight lang="html">
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Blender-to-Three.js App Template</title>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, user-scalable=no, minimum-scale=1.0, maximum-scale=1.0">
    <style>
      body {
        margin: 0px;
      }
    </style>
  </head>

  <body>
    <script type="module">

      import * as THREE from './three.module.js';

      import { OrbitControls } from './OrbitControls.js';
      import { GLTFLoader } from './GLTFLoader.js';
      import { RGBELoader } from './RGBELoader.js';

      let camera, scene, renderer;

      init();
      render();

      function init() {

        const container = document.createElement( 'div' );
        document.body.appendChild( container );

        camera = new THREE.PerspectiveCamera( 45, window.innerWidth / window.innerHeight, 0.25, 20 );
        camera.position.set( - 1.8, 0.6, 2.7 );

        scene = new THREE.Scene();

        new RGBELoader()
          .load( 'environment.hdr', function ( texture ) {

            texture.mapping = THREE.EquirectangularReflectionMapping;

            scene.background = texture;
            scene.environment = texture;

            render();

            // model

            const loader = new GLTFLoader();
            loader.load( 'suzanne.gltf', function ( gltf ) {

              scene.add( gltf.scene );

              render();

            } );

          } );

        renderer = new THREE.WebGLRenderer( { antialias: true } );
        renderer.setPixelRatio( window.devicePixelRatio );
        renderer.setSize( window.innerWidth, window.innerHeight );
        renderer.toneMapping = THREE.ACESFilmicToneMapping;
        renderer.toneMappingExposure = 1;
        renderer.outputEncoding = THREE.sRGBEncoding;
        container.appendChild( renderer.domElement );

        const controls = new OrbitControls( camera, renderer.domElement );
        controls.addEventListener( 'change', render ); // use if there is no animation loop
        controls.minDistance = 2;
        controls.maxDistance = 10;
        controls.target.set( 0, 0, - 0.2 );
        controls.update();

        window.addEventListener( 'resize', onWindowResize );

      }

      function onWindowResize() {

        camera.aspect = window.innerWidth / window.innerHeight;
        camera.updateProjectionMatrix();

        renderer.setSize( window.innerWidth, window.innerHeight );

        render();

      }

      //

      function render() {

        renderer.render( scene, camera );

      }

    </script>

  </body>
</html>
</syntaxhighlight>


You can find this file in the starter project on GitHub (linked above). So, what is going on in this code snippet? Well...


# Initializing the canvas, scene and camera, as well as the WebGL renderer.
# Creating "orbit" camera controls by using the external OrbitControls.js script.
# Loading an HDR map for image-based lighting.
# Finally, loading the glTF 2.0 model we exported previously.


If you merely open this HTML file in the browser you'll see... nothing! This is a security measure imposed by browsers. Without a properly configured web server you won't be able to launch this web application. So let's run a server!
=== Running a web server ===


If we have Python in Blender, why not use it? Let's create a basic Python script with the following content (or just copy the file from the starter project):


<syntaxhighlight lang="python">
import http.server
import os
import bpy

from threading import Thread
from functools import partial

def ServeDirectoryWithHTTP(directory='.'):
    hostname = 'localhost'
    port = 8000
    directory = os.path.abspath(directory)
    handler = partial(http.server.SimpleHTTPRequestHandler, directory=directory)
    httpd = http.server.HTTPServer((hostname, port), handler, False)
    httpd.allow_reuse_address = True

    httpd.server_bind()
    httpd.server_activate()

    def serve_forever(httpd):
        with httpd:
            httpd.serve_forever()

    thread = Thread(target=serve_forever, args=(httpd, ))
    thread.daemon = True
    thread.start()

app_root = os.path.dirname(bpy.context.space_data.text.filepath)
ServeDirectoryWithHTTP(app_root)
</syntaxhighlight>


Save it as '''server.py'''. In Blender, switch to the '''Text Editor''' area:


[[File:blender_open_text_block.jpg|657px]]


Click '''Open''', locate and open that script, then click the 'Play' icon to launch the server:


[[File:blender_run_script.jpg|843px]]


This server will be running in the background until you close Blender. Cool, yeah?
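By the way, the same serving pattern works outside Blender too. Here is a self-contained sketch (standard library only, no <code>bpy</code>; the port number 8001 is arbitrary) that serves the current directory in a background thread and fetches a page from it to verify the server is up:

<syntaxhighlight lang="python">
import http.server
import os
import urllib.request
from functools import partial
from threading import Thread

def serve_directory(directory='.', port=8001):
    """Serve the given directory over HTTP in a background daemon thread."""
    handler = partial(http.server.SimpleHTTPRequestHandler,
                      directory=os.path.abspath(directory))
    httpd = http.server.ThreadingHTTPServer(('localhost', port), handler)
    thread = Thread(target=httpd.serve_forever, daemon=True)
    thread.start()
    return httpd

if __name__ == '__main__':
    httpd = serve_directory('.', 8001)
    # The directory listing is now reachable over HTTP:
    with urllib.request.urlopen('http://localhost:8001/') as response:
        print(response.status)  # 200
    httpd.shutdown()
</syntaxhighlight>

This is handy for testing the app without keeping Blender open.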


=== Running the app ===
Open http://localhost:8000/ in your browser. You should see the following page:
[[File:three_app.jpg|1000px]]


Try to rotate or zoom the model. Admittedly, there are some issues with the rendering: it does not look exactly as it does in the Blender viewport. There are some things we can do to improve the situation, but don't expect too much: Three.js is Three.js and Blender is Blender.
 
=== Building the pipeline ===

Here is a typical pipeline for creating web apps with Three.js and Blender:

# Use the starter project to quickly create and deploy your apps. Feel free to upgrade that template - it's all open-source.
# It's much easier to load a single HDR map for environment lighting than to tweak multiple light sources with JavaScript. And it works faster!
# Blender comes first, Three.js goes second. This means you should try to design as much as possible in Blender. This will save you a great deal of coding.
# Take some time to learn more about glTF 2.0, so that you fully understand the limitations and constraints of this format.
# For anything beyond loading a mere model, refer to the myriad of [https://threejs.org/examples/#webgl_animation_keyframes Three.js examples].


== Approach #2: Verge3D ==


Verge3D is, in a way, Three.js on steroids. It includes a bunch of tools that sweeten the creation of 3D web content based on Blender scenes. As a result, this toolkit puts artists, instead of programmers, in charge of a project.


The toolkit costs some money, but you can experiment with the trial version, whose only limitation is a watermark. Installation is fairly simple: [https://www.soft8soft.com/get-verge3d get] the installer (Windows) or zip archive (macOS, Linux), then enable the Blender add-on that ships with Verge3D:


[[File:blender_verge3d_plugin.jpg|708px]]


Verge3D comes with a convenient preview feature called '''Sneak Peek''':


[[File:blender_sneak_peek.jpg|929px]]


This magic button exports the Blender scene to a temporary location and immediately opens it in the web browser:


[[File:v3d_app.jpg|1000px]]


There is no need to create a project from scratch, write any JavaScript, or care about the web server — Verge3D does this all for you.


The WebGL rendering is usually consistent with the Blender viewport (especially if you switch to the real-time renderer Eevee). This is because Verge3D tries to accurately reproduce most Blender features, such as native node-based materials, lighting, shadows, animation, morphing, etc. It works similarly to vanilla Three.js, i.e. it first exports the scene to a glTF 2.0 asset, and then loads it in the browser via JavaScript... except you don't need to write any code.


But how can we make the app interactive, if not with JavaScript code? Verge3D does it differently: it comes with a Scratch-like visual scripting environment called Puzzles. For example, to turn the default Blender cube into a nice spinning Utah teapot you can employ the following "code":


[[File:Puzzles_example.jpg|907px]]


This logic is quite self-explanatory. Under the hood, these blocks are converted to JavaScript that calls Verge3D APIs. You can still write your own JavaScript, or even add new visual blocks as plugins. In any case, Verge3D remains compatible with Three.js, so you can use Three.js snippets, examples, or apps found on the web.


Verge3D comes with many other useful features, including the App Manager, AR/VR integration, a WordPress plugin, Cordova/Electron builders, a physics engine, tons of plugins, materials, and asset packs, as well as ready-to-use solutions for the e-commerce and e-learning industries. Discussing all these is beyond the scope of this article, so refer to this [https://youtu.be/KIq2Q-DFCT0 beginner-level tutorial series] to learn more about the toolkit.


== Which approach to choose? ==


Three.js offers one feature that is quite convincing — it's free! Blender is free too, and if you've got plenty of time, you are all set! You can make something really big, share it, get some recognition, or sell your work without paying a buck first.


On the other hand, if paying some money for a [https://www.soft8soft.com/licensing/ license] is not a big deal for you (or your company), you might consider Verge3D. And this is not only because it's a more powerful variant of Three.js that will help you meet a deadline. Verge3D is designed as an artist-friendly tool that is worth looking into if you're using Blender in your pipeline.
