Animusic’s virtual reality instruments of the future

February 7, 2011 by Sarah Black

A scene from an Animusic animated musical video. (credit: Animusic, LLC)

Animusic’s fascinating and novel approach to creating and animating virtual instruments is full of possibility for the future of augmented and virtual reality.

Wikipedia | Animusic is an American company specializing in the 3D visualization of MIDI-based music. Founded by Wayne Lytle, the company is known for its Animusic compilations of computer-generated animations, based on MIDI events processed to drive the music and on-screen action simultaneously, so that every movement corresponds to a sound. Unlike many other music visualizations, the music drives the animation.

While other productions might animate figures or characters to the music, the animated models in Animusic are created first and then programmed to follow what the music “tells them” to do. “Solo cams” on the Animusic DVDs show how each instrument plays through a piece of music from beginning to end. Many of the instruments appear to be robotic or to play themselves, using curious methods to produce and visualize the original compositions. The animations typically feature dramatically lit rooms or landscapes.
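The idea that “the music drives the animation” can be illustrated with a minimal sketch: given a list of MIDI-style note events, the animation state at any time is derived from the notes themselves, so every on-screen strike corresponds to a sound. All names and the motion curve below are invented for illustration; this is not Animusic’s actual code.

```python
# Hedged sketch: animation state is computed from note events, so the
# music drives the motion rather than the motion being keyframed by hand.

def strike_amount(note_time, t, attack=0.05, decay=0.3):
    """Return a 0..1 motion amount for one note at animation time t:
    a quick swing toward the hit, then a slower rebound."""
    dt = t - note_time
    if dt < 0 or dt > attack + decay:
        return 0.0
    if dt < attack:
        return dt / attack               # hammer moving toward the string
    return 1.0 - (dt - attack) / decay   # rebound after the hit

def sample_animation(events, t):
    """events: list of (time, pitch) pairs; returns {pitch: motion}
    for every note element that is currently in motion."""
    state = {}
    for note_time, pitch in events:
        amt = strike_amount(note_time, t)
        if amt > 0.0:
            state[pitch] = max(state.get(pitch, 0.0), amt)
    return state

events = [(0.0, 60), (0.5, 64), (1.0, 67)]
print(sample_animation(events, 0.6))  # only the note struck at 0.5 s still moves
```

Because the state is a pure function of the event list, editing the music and re-sampling regenerates the animation automatically.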

The music of Animusic is principally pop-rock based, consisting of straightforward sequences of triggered samples and digital patches mostly played “dry”; i.e., with few effects. There are no lyrics or voices, save for the occasional chorus synthesizer. According to the director, most instrument sounds are generated with software synthesizers on a music workstation.

Many sounds resemble stock patches available on digital keyboards, subjected to some manipulation, such as changes in pitch or playback speed, to enhance the appeal of their timbre. The animation is created procedurally with the company’s proprietary MIDImotion software. Discreet 3D Studio Max was used for modeling, lighting, cameras, and rendering; maps were painted with Corel Painter, Deep Paint 3D, and Photoshop. Animusic has also created its own software, ANIMUSIC|studio.


Animusic LLC | What is Animusic? Virtual Instruments performing with precision timing. Individual music animations, or “music videos” ranging from about 3 to 6 minutes. As a collection, they form “visual albums,” in the form of DVDs. Like records where you can see the music. Animusic 1 has 7 completely different animations; Animusic 2 has 8. Both have quite a few bonus features, too. It’s all created digitally, utilizing a process similar to that used to produce computer animated movies (although we apply our own “secret formula,” essentially causing the instruments to magically animate themselves). The purely imaginary instruments perform in their native settings. The designs are fairly concrete. We aim for enjoyable virtual settings — existing only on the screen.

This digitally created video, below, shows a virtual instrument performing music. The CGI imagery is generated and matched to the music using Animusic’s MIDImotion software. (video credit: Animusic, LLC)

People have often asked us what software we use, and if it’s available commercially. Animusic uses a production pipeline based on proprietary software we call ANIMUSIC|studio. It is a MIDI sequencer and animation system based on a visual programming language (looks like boxes connected with wires). At the core is our motion generation software library (in its 5th generation) which we have come to call MIDImotion.

None of our software is currently available commercially. We use commercial software for modeling, shading, and rendering, while the instrument animation is always calculated procedurally using custom-created software. Our current pipeline hinges on a total rewrite of ANIMUSIC|studio from the ground up (more about that in this Newsletter). It’s based on scene-graph technology, has a new sequencer, and even MIDImotion was rewritten to be much more real-time.

http://www.youtube.com/watch?v=hyCIpKAIFyo

And as much as we liked so many things about 3ds Max, it was time to make all things new. So we’ve moved to Softimage XSI (and maybe a little ZBrush) for modeling, and back to RenderMan for rendering. ANIMUSIC|studio does all the sequencing internally, so there is no more exporting and importing of MIDI files. Instead, MIDI is sent over Gigabit Ethernet to a second workstation dedicated to hosting VST software synthesizers.

More about MIDImotion

Without MIDImotion, animating instruments using traditional “keyframing” techniques would be prohibitively time-consuming and inaccurate. By combining motion generated by approximately 12 algorithms (each with 10 to 50 parameters), the instrument animation is generated automatically with sub-frame accuracy. If the music is changed, the animation is regenerated effortlessly.
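A toy sketch of the layering idea described above: several parameterized motion generators are combined and evaluated at continuous time, so accuracy is not limited to frame boundaries, and changing the note list regenerates everything. The generators and their formulas below are invented for illustration; this is not Animusic’s MIDImotion.

```python
import math

def bounce(note_t, t, height=1.0, decay=0.2):
    """One generator: a drumstick bounce that decays after the hit."""
    dt = t - note_t
    return height * math.exp(-dt / decay) if dt >= 0.0 else 0.0

def wobble(note_t, t, amp=0.1, freq=8.0, decay=0.5):
    """Another generator: residual wobble of the struck surface."""
    dt = t - note_t
    if dt < 0.0:
        return 0.0
    return amp * math.sin(2 * math.pi * freq * dt) * math.exp(-dt / decay)

def pose(note_times, t):
    """Sum all generators over all notes. Because t is continuous,
    the pose can be sampled with sub-frame accuracy; edit note_times
    and the whole animation regenerates with no hand keyframing."""
    return sum(bounce(nt, t) + wobble(nt, t) for nt in note_times)
```

In a real system each generator would carry many more parameters (stiffness, travel, overshoot, and so on), which is where the “10 to 50 parameters” per algorithm come in.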

Our technique differs significantly from reactive sound visualization technology, as made popular by music player plug-ins. Rather than reacting to sound with undulating shapes, our animation is correlated to the music at a note-for-note granularity, based on a non-real-time analysis pre-process. Animusic instruments generally appear to generate the music heard, rather than respond to it. At any given instant, not only do we take into account the notes currently being played, but also notes recently played and those coming up soon. These factors are combined to derive “intelligent,” natural-moving, self-playing instruments. And although the original instruments created for our DVDs are often somewhat reminiscent of real instruments, the motion algorithms can be applied to arbitrary graphics models, including non-instrumental objects and abstract shapes.
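The contrast drawn above, note-for-note correlation via a non-real-time pre-process rather than reaction to the audio signal, can be sketched as a baking pass that, for each frame, weighs notes recently played and notes coming up soon. The window sizes and weights here are invented assumptions, not Animusic’s actual analysis.

```python
def motion_at(t, note_times, past=0.5, future=0.5):
    """Blend the influence of notes near time t into one 0..1 value:
    recently played notes fade out, upcoming notes 'wind up'."""
    total = 0.0
    for nt in note_times:
        if t - past <= nt <= t:        # recently played: fading out
            total += 1.0 - (t - nt) / past
        elif t < nt <= t + future:     # coming up soon: anticipating
            total += 1.0 - (nt - t) / future
    return min(total, 1.0)

def precompute(note_times, fps=30, duration=2.0):
    """Non-real-time pass: bake a motion curve for the whole piece
    before rendering, so the instrument appears to generate the music
    rather than react to it."""
    frames = int(duration * fps)
    return [motion_at(i / fps, note_times) for i in range(frames)]
```

Because the curve rises before each note, an instrument animated from it seems to prepare for the sound it is about to make, which is what makes the motion read as “playing” rather than “reacting.”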

http://www.youtube.com/watch?v=XBOQcQO0IFI

Related:
Animusic's YouTube channel
Animusic, LLC