Dancing Jellyfish

CS527 Computer Animation Project 3

Yu-Chung Chen, Dmitri Svistula, Ratko Jagodic


 

The Dancing Jellyfish Idea

The main theme is jellyfish dancing to music. The scene is set underwater, and to make things more interesting we added bubbles that move in a boid-like fashion. The starting point was Dmitri's scenegraph framework and the boid code from project two. The motion capture data directly drives the motion of the jellyfish, but each jellyfish in turn loosely guides a group of bubbles that follow the three main flocking rules. The chosen music is "The Thieving Magpie (Abridged)" from the A Clockwork Orange soundtrack.

 

Motion Capture Process

We tried the default motion capture program, which represents each tracker by its sensor ID. However, given the nature of the motion in our animation, we found it difficult to picture the final motion of our characters, so we modified the program to display a trail behind each sensor; this mattered more to us than knowing the sensor ID. The trail also conveys the speed at which a sensor is moving: if the sensor moves fast, the blocks composing its trail spread further apart, and vice versa. The image below shows what sensor movement might look like:
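The trail behavior can be sketched as follows. This is a minimal illustration, not the actual capture code: positions are sampled at a fixed rate, so the spatial gap between consecutive trail blocks is proportional to the sensor's speed. The class and member names are hypothetical.

```cpp
#include <cmath>
#include <cstddef>
#include <deque>

struct Vec3 { float x, y, z; };

// Hypothetical sketch of a sensor trail: one block per capture sample,
// oldest blocks falling off the end once the trail reaches full length.
class SensorTrail {
public:
    explicit SensorTrail(std::size_t maxBlocks) : maxBlocks_(maxBlocks) {}

    // Called once per capture sample.
    void addSample(const Vec3& p) {
        blocks_.push_front(p);
        if (blocks_.size() > maxBlocks_) blocks_.pop_back();
    }

    std::size_t size() const { return blocks_.size(); }

    // Distance between the two newest blocks: because samples arrive at a
    // fixed rate, this gap is proportional to the sensor's current speed.
    float latestGap() const {
        const Vec3& a = blocks_[0];
        const Vec3& b = blocks_[1];
        return std::sqrt((a.x - b.x) * (a.x - b.x) +
                         (a.y - b.y) * (a.y - b.y) +
                         (a.z - b.z) * (a.z - b.z));
    }

private:
    std::size_t maxBlocks_;
    std::deque<Vec3> blocks_;
};
```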


Animation

Jellyfish Body

Originally we wanted to do something like this, but modeling and animating such a base is too complex and might be slow in real time. So we simplified the jellyfish base to something much simpler: a scaling hemisphere. We build 30 animation-frame models as display lists at initialization; at runtime we switch among these precompiled frames based on a timer. One drawback is that the animation is not directly tied to the jellyfish's motion, since display list contents cannot be changed at runtime. We could instead issue the drawing calls directly in the draw function, at some cost in efficiency. The position and orientation of each jellyfish come from the position and velocity vector of its sensor, with some orientation constraints to keep the jellyfish upright.
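The frame-switching step can be sketched like this. It is a minimal illustration under stated assumptions: the frame count of 30 is from the text, but the pulse shape of the hemisphere scale and all function names are hypothetical, not the exact ones used in the project.

```cpp
#include <cmath>

const int NUM_FRAMES = 30;           // 30 precompiled display-list frames
const double PI = 3.14159265358979323846;

// Map elapsed time to one of the precompiled animation frames, looping.
// The chosen index would select which display list to call when drawing.
int frameFromTime(double elapsedSeconds, double secondsPerFrame) {
    return static_cast<int>(elapsedSeconds / secondsPerFrame) % NUM_FRAMES;
}

// Hypothetical per-frame vertical scale of the hemisphere body: a simple
// pulse that contracts and relaxes once per cycle (assumed shape).
float bodyScale(int frame) {
    double phase = 2.0 * PI * frame / NUM_FRAMES;
    return 1.0f + 0.25f * static_cast<float>(std::sin(phase));
}
```

Because each frame's geometry is baked into a display list at startup, the per-frame cost at runtime is just selecting and calling one list.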

courtesy of Kayvon Fatahalian and Tim Foley

 

Jellyfish Tentacles

y = A * cos ( B * x + C) or
y = A * sin ( B * x + C)

At rest, the motion of the tentacles is defined by a sine or cosine function, where A is the amplitude, B is the frequency, and C is the phase shift of the curve. Animation is achieved by continuously shifting the curve and drawing only a portion of it. Because the tentacles should look fixed at a point on the body, the amplitude increases linearly along the tentacle; this way, the attachment points do not appear to wiggle. The frequency changes dynamically and is computed from the current velocity vector, defined as the difference between the last and current positions. If a jellyfish is moving up, the frequency is based on the magnitude of the velocity vector; when it moves down, the frequency is fixed and the tentacles look more at rest, which makes the jellyfish appear to sink. Movement of the body also distorts the tentacles, achieved by adding sine or cosine curves to the positional curve.
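The per-point tentacle displacement described above can be written out directly. This is a minimal sketch of the stated formula; the function name and the way time is folded into the phase are assumptions.

```cpp
#include <cmath>

// Lateral offset of a tentacle point (hypothetical sketch).
// The amplitude grows linearly with x so the attachment point (x = 0)
// never wiggles; subtracting t from the phase shifts the curve over
// time, which is what animates the tentacle.
float tentacleOffset(float x,          // distance along tentacle, 0 at the body
                     float t,          // animation time (curve shift)
                     float amplitude,  // A: amplitude at unit distance
                     float frequency,  // B: spatial frequency
                     float phase)      // C: phase shift
{
    float a = amplitude * x;                        // linear amplitude growth
    return a * std::sin(frequency * x + phase - t); // shifted sine curve
}
```

Body motion would then be layered on top by adding this offset to the tentacle's positional curve, as the text describes.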

Bubbles

Each boid is texture-mapped with a round bubble image plus a white circle that is displaced slightly from the boid's center based on its velocity, creating the illusion of a highlight on the bubble. The bubbles try to follow nearby jellyfish, but since they are not assigned to fixed flocks, a bubble can switch from following one jellyfish to following another.
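The highlight displacement can be sketched as below. This is a hypothetical illustration: the text only says the offset is "based on its velocity", so the direction (against the motion) and the fixed offset length are assumptions.

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Hypothetical sketch: offset the white highlight circle against the
// bubble's direction of motion, with a fixed offset length, so a moving
// bubble's highlight appears to lag behind its center.
Vec2 highlightOffset(const Vec2& velocity, float offsetLength) {
    float speed = std::sqrt(velocity.x * velocity.x + velocity.y * velocity.y);
    if (speed == 0.0f) return Vec2{0.0f, 0.0f};  // at rest: highlight centered
    float s = -offsetLength / speed;             // unit direction, flipped sign
    return Vec2{velocity.x * s, velocity.y * s};
}
```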

Synchronization of Animation with Music

Since the graphics and the motion capture data do not run at the same rate, they had to be synchronized during playback. Even if the rates matched, any change in the graphics update framerate would push the animation and the music out of sync, and the error would accumulate over time. To solve this, at every graphics update we query the playback position of the music and use that offset to index into the motion capture data. This requires that the motion capture data and the music be the same length (timewise), so the beginning and end of the capture data had to be cleaned up a bit.
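The lookup described above can be sketched as a single mapping, called once per graphics update. The function and parameter names are assumptions; the key point is that the frame index is derived from the music position alone, so drift cannot accumulate no matter how the render framerate varies.

```cpp
// Hypothetical sketch of the synchronization step: map the current music
// playback position to a motion-capture frame index. Assumes the capture
// data and the track cover the same duration, as the text requires.
int mocapFrameForMusic(double playbackSeconds,
                       double trackLengthSeconds,
                       int numMocapFrames)
{
    double t = playbackSeconds / trackLengthSeconds;  // normalized [0, 1]
    int frame = static_cast<int>(t * numMocapFrames);
    if (frame < 0) frame = 0;                         // clamp to valid range
    if (frame >= numMocapFrames) frame = numMocapFrames - 1;
    return frame;
}
```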

 

Code & Downloads

Since the starting code was written in C++ and OpenGL, we stayed with that.
Download source.
Download binary (mac).

Questions & Answers

Nothing yet.

 

2006 Fall