MIDI-Based Animation with Python, Part I: Bass and Drums

I’ve been composing/improvising old-school techno songs where I don’t use any presets but program the patches manually, and I remembered Animusic, which uses MIDI to automate the animation process. This also gives me practice writing algorithms in preparation for AP CS A (yeah, it’s in Java; I’ll soon be writing a post about a Minecraft collaborative master and client and how I modded Baritone).

The animation tool Blender has a Python scripting API, and there’s a Python library called “mido” that makes it easy to process MIDI files.
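The main bookkeeping when driving animation from MIDI is converting tick times into Blender frame numbers. Here is a minimal sketch of that conversion, assuming a constant tempo (the function name and values are illustrative, not from the actual project code; mido reports tempo in microseconds per beat):

```python
# Hypothetical sketch: map MIDI tick positions to animation frame numbers,
# assuming a single constant tempo for the whole song.

def ticks_to_frame(ticks, ticks_per_beat, tempo_us, fps):
    """Convert an absolute MIDI tick position to a frame number.

    tempo_us is the MIDI tempo in microseconds per beat, as mido reports it.
    """
    seconds = ticks * (tempo_us / 1_000_000) / ticks_per_beat
    return round(seconds * fps)

# 480 ticks at 480 ticks/beat and 120 BPM (500_000 us/beat) is one beat,
# i.e. 0.5 seconds, which at 24 fps lands on frame 12.
frame = ticks_to_frame(480, 480, 500_000, 24)
```

With every note’s frame number in hand, the rest is just writing keyframes at those frames.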

Eventually, this project will grow into a Python package and Blender plugin that will make MIDI-based animations very easy.

No strobe, but a minimal seizure warning:
Nothing heavy, but given the rhythmic nature of the “light drums,” it may be worth a thought.

Here is the render, which took 11 long hours. Pretty minimal demo:

Flaws
It may not be obvious, but the light flash is flawed. A hit starts at 1 and then goes linearly down to 0, like a sawtooth wave. However, since the keyframes are interpolated, the “triangles” get stretched out when hits are far apart.

The keyframes are generated for every frame, which is inefficient and loses the ability to take advantage of splines. The power of reticulating splines should never be ignored. The reason for writing frame by frame is that when a hit happens, the curve will immediately jump back up no matter its current position or state of moving down.
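The per-frame sawtooth idea can be sketched like this (a minimal illustration, not the actual gist code; hit frames and decay length are made up):

```python
# Hypothetical sketch of the per-frame sawtooth: at every hit the envelope
# snaps back to 1.0, no matter where it is in its decay. Emitting one value
# per frame avoids interpolation stretching the "triangles" between
# far-apart hits.

def light_envelope(hit_frames, total_frames, decay_frames=12):
    """Return one envelope value per frame: 1.0 at a hit, linear fall to 0."""
    values = []
    level = 0.0
    step = 1.0 / decay_frames
    for frame in range(total_frames):
        if frame in hit_frames:
            level = 1.0  # a hit always wins over the current decay
        values.append(level)
        level = max(0.0, level - step)
    return values

env = light_envelope({0, 5}, 10, decay_frames=12)
```

In Blender, each value would then be keyframed onto the light, e.g. setting the light’s energy and calling `keyframe_insert` for that frame (real bpy calls, though the exact data path depends on the light setup).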

Here is the code:


As it is a proof-of-concept sandbox, the code is uploaded as a gist, not a full repo. One day… one day…

The song is called “Mayhem” and it’s on Bandcamp here:

Hope you enjoy the old-school PowerPoint-style cover art. The album is a preview album, which is why the mixing sucked. But that’s also the style of early rave music to some extent, so I’ll have to learn how to balance perfection and grunge.

Also, did you know that Animusic is not dead but is actually starting to resurrect itself? Check out https://www.animusic.com/

Thoughts on Python:
I do understand the rationale behind “duck typing”; however, there should be a way to specify what data types a function expects. Also, requiring member functions to take (self) explicitly, even though they are member functions, is kind of boilerplate for a language not bent on boilerplate. Development is relatively peaceful.

One fascinating feature of Python is that code need not live in any function, not even main. Any script can be typed by hand into an interpreter and run just like that, without any compilation 🙂 It is slow, however, and baking times were sometimes long. Perhaps compiled Python could be worth a shot for larger-scale instruments and projects.

5 Likes

Very nice. As a fellow blenderer I approve.

1 Like

Are you also a fellow believer in the coming of Animusic 3, marking a change within reality as we know it, transcending even the scope of our imaginations?

It took 11 hours? Did you use cycles?

1 Like

Yeah, with GPU. Given the small scope of the demo, I could have just rendered one minute at lower quality, but I wanted the full thing for documentation purposes and memory 🙂

Would you know any good guides on optimizing Cycles settings (other than the one at the top of the search results that’s been around for a while lol)?

1 Like

Would it be possible to implement ReWire so I can dump MIDI straight out of a DAW and into the program? It would be super cool if you could.

Wow, this is a great idea. As an avid Blender user, I can’t wait to see this plugin at full potential! Yeah, and switch to Eevee rendering, at least for test renders; 11 hours of your life have been sacrificed to the gods of path tracing.

Honestly, all the Animusic I’ve seen is Pipe Dream and pogo sticks. But I love the concept.

Very good idea. As a fellow DAW user I approve.

Amen. Sadly, getting animated emission materials takes a lot more work (and requires scripting, according to the docs, if you want frame-by-frame animation).

I do have absolute faith in the Blender Godz that they will honor their covenant of providing frame-by-frame light caches.

The reason I used light is that I could easily code a sawtooth wave, which looks good as a “flash of light.” If I had used physical movement, the object’s sudden “teleportation” would look unnatural.

Adding preemptive motion requires more thought because of the case where the hammer is rising but now needs to go down for the next hit. Overwriting will not work. You will get a hammer on crack, weed, and more crack XD
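One way to handle that case is to derive the hammer’s height from its distance to the nearest hit rather than overwriting keyframes. Here is a small sketch of that idea (my own illustration, not the plugin’s actual approach; the windup length is made up):

```python
# Hypothetical sketch of preemptive hammer motion: height 1.0 = raised,
# 0.0 = striking. Taking the minimum over all hits means a rising hammer
# turns around early for the next hit instead of teleporting.

def hammer_height(frame, hit_frames, windup=8):
    """Height of the hammer at a frame, anticipating upcoming hits."""
    heights = [min(abs(frame - hit) / windup, 1.0) for hit in hit_frames]
    return min(heights, default=1.0)  # fully raised when no hits exist

# Two hits 6 frames apart with an 8-frame windup: at frame 3 the hammer
# is only partway up and already heading back down for the second hit.
mid = hammer_height(3, [0, 6])
```

Since every frame’s height is a pure function of the hit list, there is nothing to overwrite, and closely spaced hits blend into one continuous motion.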

Also, while looking at Animusic tributes, I found someone who already made a plugin that appears to be widely used in the community. It’s old, has no GitHub repo, and is meant for older versions of Blender: http://blendit.xaa.pl/index.php?p=middrv&l=eng

My plan, however, is to code my own plugin independently and then compare similarities and differences. I did skim the manual and already saw that we share concepts, such as the “overlapping” of events that occur close to each other. To this day, I have refused to look at the code, and I will painfully continue to refuse.

So yeah, instead of going 10 feet to the water park, I’m going all around the world (Just la la la la la)

I will be so happy if someone gets that reference; I’m not even a millennial, I’m Gen Z.

1 Like

There is, at least in part: type hints!

Runtime checking is also possible with isinstance(). It still isn’t enforced or on by default, but, as you mentioned, that’s by design.
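Both can be combined; here’s a small sketch (the function name and values are just for illustration):

```python
# Annotations document the intended types (Python does not enforce them at
# runtime), while isinstance() performs an actual runtime check.

def flash_level(velocity: int, max_velocity: int = 127) -> float:
    """Map a MIDI velocity to a 0.0-1.0 light level."""
    if not isinstance(velocity, int):
        raise TypeError(f"velocity must be int, got {type(velocity).__name__}")
    return velocity / max_velocity

level = flash_level(64)      # fine; the annotations are documentation only
# flash_level("64") would raise TypeError, thanks to the isinstance() check
```

Static checkers like mypy read the annotations without running the code, which gets you most of the “specify what a function expects” benefit while keeping duck typing available when you want it.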

1 Like