Rendering 3D Meshes
This guide walks through Zaplib's rendering API. We'll go through a few steps:
- Start with a JavaScript application, which extracts a geometry from an STL file and renders it using ThreeJS and GLSL shaders.
- Move our STL loading logic into Zaplib and communicate results using Web Workers.
- Render using Zaplib.
This guide assumes an understanding of JavaScript web development, basic 3D graphics, and writing GPU shaders using a shading language (such as GLSL/HLSL).
You can either follow this tutorial directly, creating the necessary files from scratch, or read the working incremental versions of each step, located in zaplib/examples/tutorial_3d_rendering/. To start from scratch, copy zaplib/examples/tutorial_3d_rendering/step1 into a new directory at the top level of the zaplib repository called tutorial_3d_rendering.
Step 1: Rendering a mesh in ThreeJS
This guide starts with a working 3D visualization in JavaScript, which renders an example using the popular ThreeJS library. Let's take a look at our existing files.
Our index.html looks like the following:
<head>
<meta charset="utf-8" />
</head>
<body style="margin: 0; overflow:hidden;">
<script type="module" src="index.js"></script>
<div id="root" style="height: 100%; width: 100%;"></div>
</body>
In it, we define a top-level full-page div with an id of root. We load index.js as well, reproduced below. Afterward, we'll go through its important pieces.
index.js
import * as THREE from 'https://cdn.skypack.dev/three@v0.135.0';
import { OrbitControls } from 'https://cdn.skypack.dev/three@v0.135.0/examples/jsm/controls/OrbitControls'
const loadSTLIntoGeometry = async (assetUrl) => {
const buffer = await fetch(assetUrl).then(r => r.arrayBuffer());
const data = new DataView(buffer);
const HEADER_LENGTH = 80;
const numTriangles = data.getUint32(HEADER_LENGTH, true);
const vertices = new Float32Array(numTriangles * 9); // 3 vertices × 3 components per triangle
const normals = new Float32Array(numTriangles * 9);
for (let i = 0; i < numTriangles; i++) {
// Each 50-byte triangle record holds a 12-byte face normal, three 12-byte vertices, and a 2-byte attribute count.
const offset = HEADER_LENGTH + 4 + i * 50;
const normalX = data.getFloat32(offset, true);
const normalY = data.getFloat32(offset + 4, true);
const normalZ = data.getFloat32(offset + 8, true);
// Copy the three vertices (12 bytes apart in the record) and repeat the face normal for each.
for (let j = i * 9, k = 0; k < 36; j += 3, k += 12) {
vertices[j] = data.getFloat32(offset + 12 + k, true);
vertices[j + 1] = data.getFloat32(offset + 16 + k, true);
vertices[j + 2] = data.getFloat32(offset + 20 + k, true);
normals[j] = normalX;
normals[j + 1] = normalY;
normals[j + 2] = normalZ;
}
}
const geometry = new THREE.BufferGeometry();
geometry.attributes.position = new THREE.BufferAttribute(vertices, 3);
geometry.attributes.normal = new THREE.BufferAttribute(normals, 3);
geometry.attributes.offset = new THREE.InstancedBufferAttribute(new Float32Array([-10, 0, 10]), 1);
geometry.attributes.color = new THREE.InstancedBufferAttribute(new Float32Array([1, 1, 0, 0, 1, 1, 1, 0, 1]), 3);
return geometry;
}
const material = new THREE.ShaderMaterial({
vertexShader: `
varying vec3 vPos;
varying vec3 vNormal;
varying vec3 vColor;
attribute float offset;
attribute vec3 color;
void main() {
vPos = position;
vNormal = normal;
vColor = color;
gl_Position = projectionMatrix * modelViewMatrix * vec4(vec3(position.x, position.y + offset, position.z),1.0);
}
`,
fragmentShader: `
varying vec3 vPos;
varying vec3 vNormal;
varying vec3 vColor;
void main() {
vec3 lightPosition = vec3(20.,0.,30.);
vec3 lightDirection = normalize(vPos.xyz - lightPosition);
gl_FragColor = vec4(clamp(dot(-lightDirection, vNormal), 0.0, 1.0) * vColor,1.0);
}
`,
});
const init = async () => {
const div = document.getElementById("root");
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(40, 1, 0.1, 1000);
camera.position.set(0, -30, 30);
const renderer = new THREE.WebGLRenderer();
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(600, 600);
div.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, renderer.domElement);
const render = () => {
renderer.render(scene, camera);
}
const geometry = await loadSTLIntoGeometry("/zaplib/examples/tutorial_3d_rendering/teapot.stl");
const mesh = new THREE.InstancedMesh(geometry, material, 3);
scene.add(mesh);
function animate() {
requestAnimationFrame(animate);
render();
}
animate();
}
init();
This renders ThreeJS to the root div, which displays our 3D scene.
Let's focus on what is happening in the init function.
First, we have our ThreeJS boilerplate.
const div = document.getElementById("root");
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(40, 1, 0.1, 1000);
camera.position.set(0, -30, 30);
const renderer = new THREE.WebGLRenderer();
renderer.setPixelRatio(window.devicePixelRatio);
renderer.setSize(600, 600);
div.appendChild(renderer.domElement);
const controls = new OrbitControls(camera, renderer.domElement);
const render = () => {
renderer.render(scene, camera);
}
This does the following:
- Defines a new 3D scene, rendering results into our supplied div.
- Defines a camera using a perspective projection, with a field of view of 40 and near/far clipping planes of 0.1 and 1000. These numbers are not specifically important, but they are the defaults in Zaplib's Viewport, which we'll see later.
- Sets up OrbitControls, which lets us pan and zoom around our scene easily. By default, OrbitControls uses the left mouse button to rotate the camera around the origin (0,0,0) while maintaining camera distance, and the right mouse button to pan around the scene freely.
Defining geometry
const geometry = await loadSTLIntoGeometry("/zaplib/examples/tutorial_3d_rendering/teapot.stl");
const mesh = new THREE.InstancedMesh(geometry, material, 3);
scene.add(mesh);
This defines a ThreeJS InstancedMesh using a custom geometry and material. We supply an instance count of 3, meaning that we will render our model three times, with some custom properties per instance.
The geometry is loaded from a remote STL file using loadSTLIntoGeometry; let's take a look at that.
const loadSTLIntoGeometry = async (assetUrl) => {
const buffer = await fetch(assetUrl).then(r => r.arrayBuffer());
const data = new DataView(buffer);
const HEADER_LENGTH = 80;
const numTriangles = data.getUint32(HEADER_LENGTH, true);
const vertices = new Float32Array(numTriangles * 9); // 3 vertices × 3 components per triangle
const normals = new Float32Array(numTriangles * 9);
for (let i = 0; i < numTriangles; i++) {
// Each 50-byte triangle record holds a 12-byte face normal, three 12-byte vertices, and a 2-byte attribute count.
const offset = HEADER_LENGTH + 4 + i * 50;
const normalX = data.getFloat32(offset, true);
const normalY = data.getFloat32(offset + 4, true);
const normalZ = data.getFloat32(offset + 8, true);
// Copy the three vertices (12 bytes apart in the record) and repeat the face normal for each.
for (let j = i * 9, k = 0; k < 36; j += 3, k += 12) {
vertices[j] = data.getFloat32(offset + 12 + k, true);
vertices[j + 1] = data.getFloat32(offset + 16 + k, true);
vertices[j + 2] = data.getFloat32(offset + 20 + k, true);
normals[j] = normalX;
normals[j + 1] = normalY;
normals[j + 2] = normalZ;
}
}
const geometry = new THREE.BufferGeometry();
geometry.attributes.position = new THREE.BufferAttribute(vertices, 3);
geometry.attributes.normal = new THREE.BufferAttribute(normals, 3);
geometry.attributes.offset = new THREE.InstancedBufferAttribute(new Float32Array([-10, 0, 10]), 1);
geometry.attributes.color = new THREE.InstancedBufferAttribute(new Float32Array([1, 1, 0, 0, 1, 1, 1, 0, 1]), 3);
return geometry;
}
Without going line by line, the function does the following:
- Fetches a remote asset and loads its result into an ArrayBuffer. We'll be rendering a Utah teapot; by default this is available in zaplib/examples/tutorial_3d_rendering/.
- Reads through the buffer, extracting information for each triangle one by one. See the binary STL spec for details on its structure. We load each vertex and its corresponding normal into Float32Arrays.
- Defines a new ThreeJS BufferGeometry and creates new attributes to define its shape.
  - The extracted position and normal data is loaded in as BufferAttributes.
  - We give each instance a y-axis offset, represented as floats, and load it as an InstancedBufferAttribute.
  - We give each instance a color, represented as RGB values, and load it as an InstancedBufferAttribute.
Our mesh's material is specified using a ShaderMaterial, and aims to provide very basic lighting from a fixed point light.
The vertex shader saves our position, normal, and color as varying parameters to be used in the fragment shader, and converts our position from model coordinates to screen coordinates. We apply our per-instance offset value to get a final position.
varying vec3 vPos;
varying vec3 vNormal;
varying vec3 vColor;
attribute float offset;
attribute vec3 color;
void main() {
vPos = position;
vNormal = normal;
vColor = color;
gl_Position = projectionMatrix * modelViewMatrix * vec4(vec3(position.x, position.y + offset, position.z),1.0);
}
The fragment shader specifies a fixed light source and calculates each pixel's color by multiplying our instance color by a light intensity, computed from the dot product of the light direction and the normal vector. We clamp the light intensity between 0 and 1.
varying vec3 vPos;
varying vec3 vNormal;
varying vec3 vColor;
void main() {
vec3 lightPosition = vec3(20.,0.,30.);
vec3 lightDirection = normalize(vPos.xyz - lightPosition);
gl_FragColor = vec4(clamp(dot(-lightDirection, vNormal), 0.0, 1.0) * vColor,1.0);
}
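To build intuition for that lighting term, here is the same Lambertian calculation as a small standalone Rust sketch; the dot and lambert helpers are our own, not part of the tutorial code:
fn dot(a: [f32; 3], b: [f32; 3]) -> f32 {
a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
}
// Mirrors the shader: intensity is 1.0 when the normal points straight back at the light, and clamps to 0.0 for surfaces facing away.
fn lambert(light_direction: [f32; 3], normal: [f32; 3]) -> f32 {
(-dot(light_direction, normal)).clamp(0.0, 1.0)
}
fn main() {
assert_eq!(lambert([0., 0., -1.], [0., 0., 1.]), 1.0); // fully lit
assert_eq!(lambert([0., 0., 1.], [0., 0., 1.]), 0.0); // facing away
}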
Great! Running the example, we see our 3D scene with rudimentary lighting and mouse pan/zoom controls. Notice the delay between page load and scene rendering: that's our STL extraction code running on the main thread.
Step 2: STL extraction in Rust
WebAssembly and Rust are most useful for expensive operations, so let's offload STL extraction from JavaScript to Rust and look at the tradeoffs. At a high level, this means:
- performing STL extraction in a Web Worker, and therefore in parallel with our main thread
- performing our network request for the STL file in Rust
- communicating our result buffer back to JavaScript
As a reminder, a working example at the end of this step is available in zaplib/examples/tutorial_3d_rendering/step2.
Instantiate a new Zaplib project
This structure is further explained in previous tutorials. We'll need:
- a Cargo.toml file with the Zaplib dependency:
[package]
name = "tutorial_3d_rendering"
version = "0.0.1"
edition = "2018"
[dependencies]
zaplib = { path = "../zaplib/main" }
- and a Zaplib entrypoint for WebAssembly, in src/main.rs:
use zaplib::*;
fn call_rust(_name: String, _params: Vec<ZapParam>) -> Vec<ZapParam> {
vec![]
}
register_call_rust!(call_rust);
Port STL loading to Rust
Add a function to src/main.rs for STL loading. We can mirror the algorithm we had in JavaScript. Here's what that looks like:
fn parse_stl() -> Vec<ZapParam> {
let mut file = UniversalFile::open("zaplib/examples/tutorial_3d_rendering/teapot.stl").unwrap();
let mut data = vec![];
file.read_to_end(&mut data).unwrap();
const HEADER_LENGTH: usize = 80;
let num_triangles = get_u32_le(&data, HEADER_LENGTH) as usize;
let mut vertices = Vec::with_capacity(num_triangles * 9);
let mut normals = Vec::with_capacity(num_triangles * 9);
for i in 0..num_triangles {
let offset = HEADER_LENGTH + 4 + i * 50;
let normal_x = get_f32_le(&data, offset);
let normal_y = get_f32_le(&data, offset + 4);
let normal_z = get_f32_le(&data, offset + 8);
for j in (0..36).step_by(12) {
vertices.push(get_f32_le(&data, offset + 12 + j));
vertices.push(get_f32_le(&data, offset + 16 + j));
vertices.push(get_f32_le(&data, offset + 20 + j));
normals.push(normal_x);
normals.push(normal_y);
normals.push(normal_z);
}
}
vec![vertices.into_param(), normals.into_param()]
}
This code looks mostly the same; here are a few notable differences:
- We fetch and read the file using Zaplib's UniversalFile API, which performs the network request for us.
- We use Zaplib's performant byte_extract module to read data. This must be imported by adding use zaplib::byte_extract::{get_f32_le, get_u32_le};. The module provides both little-endian and big-endian extraction functions for different primitive types (see the sketch below).
- We use the into_param() helper to convert f32 vectors into params we can return to JavaScript.
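As a quick illustration of those two pieces working together, here is a hedged sketch; the floats_param function and its buffer layout are invented for this example, not part of the tutorial:
use zaplib::*;
use zaplib::byte_extract::{get_f32_le, get_u32_le};
// Assumed layout for this sketch: a leading little-endian u32 count, then `count` f32 values.
fn floats_param(data: &[u8]) -> Vec<ZapParam> {
let count = get_u32_le(data, 0) as usize;
let mut values = Vec::with_capacity(count);
for i in 0..count {
values.push(get_f32_le(data, 4 + i * 4));
}
// into_param() hands the vector to JavaScript, where it arrives as a Float32Array.
vec![values.into_param()]
}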
We then integrate this into call_rust:
fn call_rust(name: String, _params: Vec<ZapParam>) -> Vec<ZapParam> {
if name == "parse_stl" {
parse_stl()
} else {
panic!("Unknown function name");
}
}
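If you expect to add more entrypoints later, an equivalent match-based dispatch reads a little more cleanly; this is purely a stylistic alternative, not something the tutorial requires:
fn call_rust(name: String, _params: Vec<ZapParam>) -> Vec<ZapParam> {
match name.as_str() {
"parse_stl" => parse_stl(),
// Additional functions can be dispatched here by name.
_ => panic!("Unknown function name"),
}
}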
To build, run:
cargo zaplib build -p tutorial_3d_rendering
Calling from JS
To call this function from our JavaScript, let's add the Zaplib dependency to index.html. Add a script tag in the <body> section (the exact path depends on how your setup serves the Zaplib runtime):
<body style="margin: 0; overflow:hidden;">
<!-- Load the Zaplib JavaScript runtime; adjust this path to wherever your build serves it. -->
<script src="zaplib/web/dist/zaplib.development.js"></script>
Then, modify loadSTLIntoGeometry to replace our JavaScript parsing code:
const loadSTLIntoGeometry = async (assetUrl) => {
await zaplib.initialize({ wasmModule: '/target/wasm32-unknown-unknown/debug/tutorial_3d_rendering.wasm' });
const [vertices, normals] = await zaplib.callRustAsync("parse_stl");
const geometry = new THREE.BufferGeometry();
geometry.attributes.position = new THREE.BufferAttribute(vertices, 3);
geometry.attributes.normal = new THREE.BufferAttribute(normals, 3);
return geometry;
}
A few key changes:
- We call zaplib.initialize with the location of our built WebAssembly binary.
- zaplib.callRustAsync returns our vertices and normals already as Float32Arrays, which we can plug directly into ThreeJS.
Great! Now let's run the example in the browser. There should be no difference in behavior; everything loads as before, but without blocking our main browser thread.
Step 3: Rendering in Zaplib
In addition to processing tasks, we can also render to the DOM directly from Rust using Zaplib. We can draw UI primitives as well as a full 3D Viewport, which gets output to a canvas element on our webpage.
For an introduction to basic rendering, take a look at Tutorial: Hello World Canvas. Just like in that tutorial, let's create a basic Zaplib application. Here is how our Rust code should look at this point:
#[derive(Default)]
struct App {
window: Window,
pass: Pass,
view: View,
}
impl App {
fn new(_cx: &mut Cx) -> Self {
Self::default()
}
fn handle(&mut self, _cx: &mut Cx, _event: &mut Event) {}
fn draw(&mut self, cx: &mut Cx) {
self.window.begin_window(cx);
self.pass.begin_pass(cx, Vec4::color("0"));
self.view.begin_view(cx, LayoutSize::FILL);
cx.begin_padding_box(Padding::hv(50., 50.));
TextIns::draw_walk(cx, "Hello, World!", &TextInsProps::default());
cx.end_padding_box();
self.view.end_view(cx);
self.pass.end_pass(cx);
self.window.end_window(cx);
}
}
main_app!(App);
Now we just need to connect the rendering to our JavaScript page. To do so, remove our ThreeJS rendering: comment out the entirety of index.js and replace it with:
zaplib.initialize({ wasmModule: '/target/wasm32-unknown-unknown/debug/tutorial_3d_rendering.wasm', defaultStyles: true });
Note the addition of defaultStyles, which styles our full-screen canvas correctly and adds a loading indicator.
Rebuild the WebAssembly binary and refresh the page. You should see a black background and a "Hello, World!" message. Congratulations, we're rendering from Rust! ⚡️
Rendering a 3D Viewport
Let's get back to our 3D example. One of the major advantages of Zaplib is the ability to use common structs for renderable data, instead of positional TypedArrays in JavaScript. In ThreeJS, we had to provide attributes as floats, but here we can be a bit more descriptive.
Generating geometries
Let's represent a vertex with the struct below, and add it to src/main.rs:
#[derive(Clone, Copy)]
#[repr(C)]
struct Vertex {
position: Vec3,
normal: Vec3,
}
For each vertex of our shape, we represent the position and normal as a Vec3, a three-dimensional vector of floats. We have to add #[repr(C)] to indicate C struct alignment, which guarantees the fields are laid out in memory in declaration order.
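If you want to check that this layout is what the GPU will read, a quick (hypothetical) assertion does the trick. It assumes Vec3 is three packed f32s, i.e. 12 bytes:
fn layout_check() {
// Two Vec3 fields at 12 bytes each, in declaration order, with no padding.
assert_eq!(std::mem::size_of::<Vertex>(), 24);
}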
Let's also add an instance struct, shown below.
#[derive(Clone, Copy)]
#[repr(C)]
struct Instance {
offset: f32,
color: Vec3,
}
Like in JavaScript, we provide a y-axis offset and color per instance. This data is fixed, so we can provide it as a constant. Note how much more readable this is than the flat buffers in JavaScript.
const INSTANCES: [Instance; 3] = [
Instance { offset: -10., color: vec3(1., 1., 0.) },
Instance { offset: 0., color: vec3(0., 1., 1.) },
Instance { offset: 10., color: vec3(1., 0., 1.) },
];
Modify the parse_stl function now to generate a Zaplib geometry instead of float arrays. Let's take a look at the final function:
fn parse_stl(cx: &mut Cx, url: &str) -> GpuGeometry {
let mut file = UniversalFile::open(url).unwrap();
let mut data = vec![];
file.read_to_end(&mut data).unwrap();
const HEADER_LENGTH: usize = 80;
let num_triangles = get_u32_le(&data, HEADER_LENGTH) as usize;
let vertices: Vec<Vertex> = (0..num_triangles)
.flat_map(|i| {
let offset: usize = HEADER_LENGTH + 4 + i * 50;
let normal = vec3(get_f32_le(&data, offset), get_f32_le(&data, offset + 4), get_f32_le(&data, offset + 8));
[
Vertex {
position: vec3(
get_f32_le(&data, offset + 12),
get_f32_le(&data, offset + 16),
get_f32_le(&data, offset + 20),
),
normal,
},
Vertex {
position: vec3(
get_f32_le(&data, offset + 24),
get_f32_le(&data, offset + 28),
get_f32_le(&data, offset + 32),
),
normal,
},
Vertex {
position: vec3(
get_f32_le(&data, offset + 36),
get_f32_le(&data, offset + 40),
get_f32_le(&data, offset + 44),
),
normal,
},
]
})
.collect();
let indices: Vec<[u32; 3]> = (0..num_triangles as u32).map(|i| [i * 3, i * 3 + 1, i * 3 + 2]).collect();
// Register the geometry with the framework, making it available on the GPU.
GpuGeometry::new(cx, vertices, indices)
}
Note:
- Our vertex attributes are now represented by a Vec<Vertex> instead of multiple arrays.
- We must generate a vector of indices to map vertices to triangles. Our approach here is naive, but indexing can be very useful for reducing memory costs when many vertices are duplicated (see the sketch below).
- Our resulting vertices and indices are passed to GpuGeometry::new, which registers the geometry with the framework and makes it available on our GPU.
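To make that memory argument concrete, here is a small standalone sketch (not tutorial code) of how indexing deduplicates shared vertices; a quad becomes 4 vertices plus 6 indices instead of 6 full vertices:
fn quad_index_demo() {
// Four corner positions, each stored exactly once.
let positions: [[f32; 3]; 4] = [[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]];
// The two triangles share corners 0 and 2 through the index buffer, so no vertex data is duplicated.
let triangles: [[u32; 3]; 2] = [[0, 1, 2], [0, 2, 3]];
assert_eq!(positions.len(), 4);
assert_eq!(triangles.len() * 3, 6); // 6 indices reference only 4 vertices
}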
Generating geometry on startup
We now need a way to actually call parse_stl and save our geometry. Our handle function is the main entrypoint into the application lifecycle. One of our event types, Event::Construct, is called once after the framework has loaded; that sounds like a good place to load geometry. Write the handle function as follows:
fn handle(&mut self, cx: &mut Cx, event: &mut Event) {
if let Event::Construct = event {
self.geometry = Some(parse_stl(cx, "zaplib/examples/tutorial_3d_rendering/teapot.stl"));
cx.request_draw();
}
}
and add the geometry to App:
#[derive(Default)]
struct App {
window: Window,
pass: Pass,
view: View,
geometry: Option<GpuGeometry>,
}
Note:
- We pattern match on event, which is an enum of all possible event types.
- geometry is saved as an Option type, because it will be None initially, before loading.
- We call cx.request_draw after this is done to tell our framework to draw. This function is the only way to force re-draws.
Defining the shader
We need a shader describing how to render our geometry to the screen, the same way we defined a ShaderMaterial in ThreeJS. Zaplib uses a custom shader dialect, which looks similar to Rust code and is cross-platform compatible with web and native graphics frameworks. Define this shader above the App struct definition:
static SHADER: Shader = Shader {
build_geom: None,
code_to_concatenate: &[
Cx::STD_SHADER,
code_fragment!(
r#"
instance offset: float;
instance color: vec3;
geometry position: vec3;
geometry normal: vec3;
fn vertex() -> vec4 {
return camera_projection * camera_view * vec4(vec3(position.x, position.y + offset, position.z), 1.);
}
fn pixel() -> vec4 {
let lightPosition = vec3(20.,0.,30.);
let lightDirection = normalize(position - lightPosition);
return vec4(clamp(dot(-lightDirection, normal), 0.0, 1.0) * color,1.0);
}"#
),
],
};
Read the above carefully, and compare it to our previous JavaScript shader, reproduced below.
const material = new THREE.ShaderMaterial({
vertexShader: `
varying vec3 vPos;
varying vec3 vNormal;
varying vec3 vColor;
attribute float offset;
attribute vec3 color;
void main() {
vPos = position;
vNormal = normal;
vColor = color;
gl_Position = projectionMatrix * modelViewMatrix * vec4(vec3(position.x, position.y + offset, position.z),1.0);
}
`,
fragmentShader: `
varying vec3 vPos;
varying vec3 vNormal;
varying vec3 vColor;
void main() {
vec3 lightPosition = vec3(20.,0.,30.);
vec3 lightDirection = normalize(vPos.xyz - lightPosition);
gl_FragColor = vec4(clamp(dot(-lightDirection, vNormal), 0.0, 1.0) * vColor,1.0);
}
`,
});
Some key differences:
- Zaplib shaders take in both a default geometry and an array of shader fragments to concatenate. We pass in None since we are defining a custom geometry, and prepend Cx::STD_SHADER to get default shader properties.
- Like the attributes in JS, we declare instance parameters. The order here is very important and must match the field order of the Instance struct, since we interpret the buffer linearly (see the sketch below).
- We use geometry parameters to deconstruct the values of our vertex attributes. The order here is similarly important.
- Instance and geometry parameters are available to both fragment and vertex shaders, so we do not need varying variables to forward them.
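The correspondence in that second point is worth spelling out. Annotated as comments, the mapping looks like this (a restatement of the struct above, not new tutorial code):
#[derive(Clone, Copy)]
#[repr(C)]
struct Instance {
offset: f32, // lines up with the 1st declaration: instance offset: float;
color: Vec3, // lines up with the 2nd declaration: instance color: vec3;
}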
Drawing a mesh
Now that we have both the geometry and shader defined, we can add our geometry to a Viewport3D. The Viewport, like many other UI widgets from Zaplib, is provided by the zaplib_widget crate. Add it as a dependency in Cargo.toml:
zaplib_widget = { path = "../zaplib/widget" }
and import it at the top of src/main.rs:
use zaplib_widget::*;
In our draw function, add the following between begin_view and end_view. The full function body now looks like this:
self.window.begin_window(cx);
self.pass.begin_pass(cx, Vec4::color("300"));
self.view.begin_view(cx, LayoutSize::FILL);
if let Some(geometry) = &self.geometry {
self.viewport_3d.begin_draw(
cx,
Viewport3DProps {
initial_camera_position: Coordinates::Cartesian(vec3(0., -30., 30.)),
..Viewport3DProps::DEFAULT
},
);
cx.add_mesh_instances(&SHADER, &INSTANCES, geometry.clone());
self.viewport_3d.end_draw(cx);
}
self.view.end_view(cx);
self.pass.end_pass(cx);
self.window.end_window(cx);
In short, this checks whether we have a loaded geometry and, if so, draws a viewport with our instances of it. We define an initial_camera_position with the same coordinates as our ThreeJS sketch.
Add viewport_3d to the application struct:
#[derive(Default)]
struct App {
window: Window,
pass: Pass,
view: View,
viewport_3d: Viewport3D,
geometry: Option<GpuGeometry>,
}
Rebuild the application and refresh your browser. Whoa, we're now fully rendering 3D geometry in Rust!
Lastly, let's add camera controls like ThreeJS's OrbitControls. Viewport3D has this out of the box, but we need to make sure our event handler forwards events to it, so call viewport_3d.handle at the top of your handle function:
fn handle(&mut self, cx: &mut Cx, event: &mut Event) {
self.viewport_3d.handle(cx, event);
if let Event::Construct = event {
self.geometry = Some(parse_stl(cx, "zaplib/examples/tutorial_3d_rendering/teapot.stl"));
cx.request_draw();
}
}
Build and run the application. Pan and rotate with the mouse buttons, and enjoy your new WebAssembly-rendered graphics!