Developing a Data-Driven Game & Wind Animation with Canvas, Vue & d3.js

Joe Pavitt
7 min read · Oct 19, 2021

--

This article is part of a series of articles about the Digital Experience for the Mayflower Autonomous Ship.

In this piece, we will focus on how we used d3.js to build Challenge 2. This challenge tasks users with safely navigating through multiple environmental and man-made data sets: wave direction, wave intensity, icebergs and maritime traffic/channels.

Users need to avoid the icebergs and maritime channels, whilst also staying clear of high-intensity waves travelling perpendicular to the ship's direction of travel.

Given the sheer number of possible paths that the user could create, we needed a system that could flexibly calculate a “score” for any of the paths available. As part of this, we also needed to detect “collisions” along that path.

To achieve this we created an architecture that utilises hidden data texture layers. These textures were modified greyscale versions of the visible layers you see in the experience.

left: the visible graphic detailing the wave height/intensity; right: the greyscale “data layer” of the wave height/intensity.
left: the visible detail for where icebergs are present on the map; right: the greyscale “data layer” used for collision detection.

Each pixel’s corresponding RGB value is read by the Mayflower Autonomous Ship as it moves across the screen. In the example of our iceberg layer, a threshold is checked. If the pixel colour exceeds that threshold we consider the route a failure, i.e. the ship has “collided” with an iceberg.

In other cases, we could also accumulate sums or take a mean of the pixel colours.
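As a minimal sketch of these two sampling strategies, assuming a hypothetical threshold value and that each sample has already been scaled to the 0 → 1 range:

```javascript
// Hypothetical threshold; the real value is tuned per data layer.
const ICEBERG_THRESHOLD = 0.5;

// Collision layer: any single sample above the threshold fails the route.
function isCollision(sample, threshold = ICEBERG_THRESHOLD) {
  return sample > threshold;
}

// Accumulating layer: mean of all samples taken along a path.
function meanIntensity(samples) {
  const sum = samples.reduce((acc, s) => acc + s, 0);
  return samples.length ? sum / samples.length : 0;
}
```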

It would also be possible to store 3D directional data in the RGB values, as is done when defining normal maps in the 3D industry. Whilst we use a slightly different method for wind direction (detailed in the Animating the Waves section of this article), we could have painted an RGB flow map and used the respective RGB values to read the wave direction at any point. Using the three-dimensional nature of colour data also means we could have stored three data layers in each texture if we were seeking to optimise image load times.
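A flow map of this kind is usually decoded by remapping two colour channels from 0 → 255 to -1 → 1, with 128 meaning "no flow" in that axis. A sketch of such a decoder (our game does not use this approach; the function name and mapping are illustrative):

```javascript
// Decode a flow-map texel: red encodes the x component of direction,
// green the y component, each remapped from 0..255 to -1..1.
function decodeFlow(r, g) {
  return {
    x: (r / 255) * 2 - 1,
    y: (g / 255) * 2 - 1
  };
}
```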

Sampling Data Layers

In order to sample an image at any x- and y-coordinate in the browser we need to first render this image in a <canvas />. To do this, we create a <canvas /> and <img /> for each .jpg that we have, representing each data layer.

<div>
  <div>
    <canvas v-for="layer in layers" :key="layer.label" :ref="'data-layer--' + layer.label"></canvas>
  </div>
  <div>
    <img v-for="layer in layers" :key="layer.label" :ref="'img--' + layer.label" :src="'./data-layer--' + layer.label + '.jpg'" @load="loadDataLayer(layer.label)"/>
  </div>
</div>

Then, in our Vue component, we draw these images onto their respective canvas from the @load() function of the <img />.

loadDataLayer (layerLabel) {
  this.$nextTick(() => {
    let canvas = this.$refs['data-layer--' + layerLabel][0]
    let img = this.$refs['img--' + layerLabel][0]
    let container = canvas.parentElement
    canvas.width = canvas.clientWidth
    canvas.height = container.clientHeight
    // scale the image to preserve its aspect ratio
    // (note: CSS dimensions need explicit units)
    if (canvas.width > canvas.height) {
      img.style.height = (img.height * (canvas.height / canvas.width)) + 'px'
    } else {
      img.style.width = (img.width * (canvas.width / canvas.height)) + 'px'
    }

    canvas.getContext('2d').drawImage(
      img, 0, 0, img.width, img.height,
      0, 0, canvas.width, canvas.height
    )
  })
}

Now, with our canvases filled with data layers, we can sample any pixel (x, y) from any layer, using its respective ref, with the following code:

sampleDataLayerXY (layerRef, x, y) {
  // get our data layer canvas
  let canvas = this.$refs[layerRef][0]
  // get the RGBA value at the (x, y) coordinate
  let rgba = canvas.getContext('2d').getImageData(x, y, 1, 1).data
  // select the red channel,
  // although we could in theory pick any of the RGB channels
  let r = rgba[0]
  // scale the value from 0 -> 255 to 0 -> 1
  return r / 255
},

Each of our data layers is greyscale, black being represented by RGB(0, 0, 0) and white by RGB(255, 255, 255). We get the RGBA array at a particular (x, y) coordinate for our layer from the `getImageData` command. We then select the red channel (rgba[0]) and divide it by 255. This scales our values from 0 -> 255 to 0 -> 1, where 0 is a perfectly black pixel and 1 is a perfectly white pixel.

For our iceberg layer, as soon as a white pixel is detected we consider the Mayflower to have crashed. For the wave intensity layer, it’s a little more complex. We do look for a breach of threshold, but wave intensity is only important for capsizing if the wave direction is perpendicular to the direction of travel.

// sample the x,y coordinate at the last location of the ship
const p0 = path.getPointAtLength(t0 * l);
// sample the x,y coordinate at the current location of the ship
const p = path.getPointAtLength(t * l);
// which direction is our ship facing?
const shipAngle = (Math.atan2(p.y - p0.y, p.x - p0.x) * 180 / Math.PI);
// what direction are our waves travelling?
const waveDirection = (Math.atan2(flow.y, flow.x) * 180 / Math.PI)
// angle of waves relative to the ship's direction
const relativeAngle = waveDirection - shipAngle;
// how perpendicular to the wave direction is the ship travelling?
// 0 = parallel; 1 = perpendicular
let perpendicular = Math.min(1.1 * Math.abs(Math.sin(relativeAngle * (Math.PI / 180))), 1);

Given this perpendicular value, we can then merge this with our wave height/intensity that was read from our greyscale data layer in order to compare against our threshold for capsizing.
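A minimal sketch of that merge, assuming a hypothetical capsize threshold (the function name and tuning value are illustrative, not the production code):

```javascript
// Hypothetical tuning value for when the ship capsizes.
const CAPSIZE_THRESHOLD = 0.6;

// waveIntensity: 0..1 value sampled from the greyscale data layer
// perpendicular: 0 = travelling parallel to the waves, 1 = side-on
function wouldCapsize(waveIntensity, perpendicular, threshold = CAPSIZE_THRESHOLD) {
  // a high wave only matters if it strikes the ship side-on
  return waveIntensity * perpendicular > threshold;
}
```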

Animating the Waves

The style we have used to render the waves is heavily inspired by the beautiful work at Earth Nullschool. I was fortunate to attend a talk by Cameron, the main developer of Nullschool, back in 2015, and so could loosely remember the methods he mentioned at the time. Additionally, I found this windy.js file, which also used similar methods. The windy.js use case was far more complex than we needed, though, given that it works in lat/lng, and so we decided to build ours from the ground up.

Our animation is built on top of a “Vector Field”, a grid of 2d coordinates that define the wind/wave direction and intensity across our screen.

Visualisation of the underlying vector field that controls the motion of the animated waves

With the Vector Field grid in place, we can then pick any x- and y-coordinate on the screen and interpolate the direction from the nearest 4 defined vector field values.

Annotated screenshot of the vector field showing which four values we would read from the vector field data in order to interpolate a velocity value at any location.
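That interpolation can be sketched as a standard bilinear blend. Here `field` is assumed to be a 2D array of `{x, y}` vectors with one entry per `CELL_SIZE` pixels; the names and grid spacing are illustrative, not the production code:

```javascript
// Hypothetical spacing, in pixels, between vector field grid points.
const CELL_SIZE = 50;

function sampleField(field, px, py) {
  // position in grid coordinates
  const gx = px / CELL_SIZE;
  const gy = py / CELL_SIZE;
  // the four surrounding grid points (clamped to the field's edges)
  const x0 = Math.floor(gx), y0 = Math.floor(gy);
  const x1 = Math.min(x0 + 1, field[0].length - 1);
  const y1 = Math.min(y0 + 1, field.length - 1);
  // fractional distance between those grid points
  const fx = gx - x0, fy = gy - y0;
  const lerp = (a, b, t) => a + (b - a) * t;
  // blend horizontally along the top and bottom edges, then vertically
  const top = {
    x: lerp(field[y0][x0].x, field[y0][x1].x, fx),
    y: lerp(field[y0][x0].y, field[y0][x1].y, fx)
  };
  const bottom = {
    x: lerp(field[y1][x0].x, field[y1][x1].x, fx),
    y: lerp(field[y1][x0].y, field[y1][x1].y, fx)
  };
  return {
    x: lerp(top.x, bottom.x, fy),
    y: lerp(top.y, bottom.y, fy)
  };
}
```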

This then gives us an x and y value that defines our velocity at that point in the grid. With this velocity, we shift that particle to a new location. Every 100ms we revisit this particle, calculate the velocity at its current location and then shift the particle’s location again.

Animation showing the path within the Vector Field of one particle.
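The 100ms update step can be sketched as follows, with the particle shape, `stepParticles` name and `SPEED` value all illustrative (`sampleField` stands in for whatever reads the interpolated vector field):

```javascript
const SPEED = 1.5;

// advance every particle by the velocity sampled at its current position
function stepParticles(particles, sampleField) {
  particles.forEach((particle) => {
    const v = sampleField(particle.x, particle.y);
    particle.x += v.x * SPEED;
    particle.y += v.y * SPEED;
  });
}

// in the browser this would run on a timer:
// setInterval(() => stepParticles(particles, sampleField), 100);
```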

With motion sorted, we then sought to make their appearance more wave-like with a fading tail. This is where the power of drawing onto a <canvas /> comes into its own. We utilise the fillRect() function and the context’s globalCompositeOperation property:

const ctx = canvas.getContext("2d")
ctx.lineWidth = LINE_WIDTH;
ctx.strokeStyle = LINE_COLOR;
// define our new fill style, where DECAY controls how quickly the paths will fade away
ctx.fillStyle = "rgba(0, 0, 0, " + (1 - DECAY) + ")";
// draw a rectangle over our whole canvas with the above fillStyle.
// this gives a fade illusion on all contents of the canvas at this point.
ctx.fillRect(0, 0, canvas.width, canvas.height);

Each frame, this fill is applied over the existing frame, fading it slightly, before we add the latest path segment to the canvas. This newest segment, drawn between particle.x and particle.x + particle.dx * SPEED, is rendered at full opacity and represents the head of the particle’s path.

// loop through all of our particles
this.particles.forEach((particle) => {
  // start drawing a path
  ctx.beginPath()
  // is our particle old?
  // If so, relocate it (gives the illusion of more particles)
  if (particle.age > MAX_AGE) {
    this.randomisePosition(particle).age = 0
  }
  // move our canvas "pen" to the particle's position
  ctx.moveTo(particle.x, particle.y)
  // calculate the new position of the particle given our velocity (dx/dy)
  particle.x = particle.x + particle.dx * SPEED
  particle.y = particle.y + particle.dy * SPEED
  // move our pen to the new location
  ctx.lineTo(particle.x, particle.y)
  // render this movement as a line
  ctx.stroke()
  // get the new x/y velocity vector at this location
  let vector = this.sampleField(particle.x, particle.y)
  // store our new velocity for the next loop/frame
  particle.dx = vector.x
  particle.dy = vector.y
  particle.age += 1
})

It now becomes very easy for us to experiment with different path decay times, widths and colours, and with particle colours, speeds and maximum ages.

Demonstrations of alternative input values that control the appearance and behaviour of the wave animations.

For our final wave animation, we had settled on the following:

const PARTICLES = 3500
const MAX_AGE = 8
const LINE_WIDTH = 6
const LINE_COLOR = 'rgba(255, 255, 255, 0.4)'
const SPEED = 1.5
const DECAY = 0.2

Concluding Thoughts

I hope that you’ve found this article insightful and useful, whether it be for using hidden data layers in your browser-based games or for creating interactive wind animations using Vector Fields.

Please don’t hesitate to reach out should you need assistance with either of these tasks.
