Mutone is an iPad app that I created to explore genetic algorithms and evolution in code. In a nutshell, you draw a bunch of audio tracks and let them reproduce when they bump into each other. As this happens the sounds mutate and change in unexpected ways. Selection is an aesthetic decision made by the user: periodically pruning the soundscape keeps unfit tracks from dominating it.
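The pruning step can be sketched roughly like this (plain C rather than the app’s Objective-C; the Track struct, its fields, and the threshold rule are illustrative assumptions, not Mutone’s actual code):

```c
/* A minimal sketch of Mutone-style selection: the user's aesthetic
 * ratings stand in for a fitness function, and pruning simply drops
 * tracks whose rating falls below a threshold.  The Track struct and
 * field names are assumptions, not the app's actual data model. */
#include <stddef.h>

typedef struct {
    float pitch[3];   /* one pitch offset per axis (hypothetical genome) */
    float rating;     /* user's aesthetic rating, 0..1 */
} Track;

/* Compacts `tracks` in place, keeping only those rated at or above
 * `threshold`.  Returns the new track count. */
size_t prune_tracks(Track *tracks, size_t count, float threshold) {
    size_t kept = 0;
    for (size_t i = 0; i < count; i++) {
        if (tracks[i].rating >= threshold) {
            tracks[kept++] = tracks[i];
        }
    }
    return kept;
}
```

Because the survivors are compacted in place, the caller just shrinks its count and keeps playing.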
Throughout the semester I’ve been compiling my Nature of Code homework into an umbrella app. This allowed me to build up a reusable collection of iOS classes, and it’s also nice to have the sketches in a single place. The code that resulted from this process has been really useful in a handful of other OpenGL iOS apps that I’ve been working on.
This app is now available for free on the App Store. Each of the sketches links to its respective source code on GitHub, as well as the relevant chapter in Dan Shiffman’s book. Visit the App Store to download the Nature of Code app for iPad or iPhone, or if you’d rather compile the app from source, you can grab it at the GitHub repo.
This is an ongoing project to build up Objective-C classes for creative coding. Pull requests are appreciated and additional sketches are also welcome! If there’s enough interest, updates will be submitted to the App Store.
After exploring a few paths over the last couple of weeks, I’ve arrived at a general idea for my final project in Nature of Code. Briefly, it’s a toy for creating evolving audio landscapes. The user creates gestures in a 3D space (the genotype), which get mapped to audio (the phenotype). Let’s call these gestural/audio pairings “monads”. These monads are then released into a field of other monads and allowed to reproduce through collision. This should ideally create a continually changing soundscape that I can’t quite imagine yet.
The “Evolution of Code” chapter is fascinating and I wanted to use a genetic algorithm in a non-visual way. The “phasing” music of Steve Reich has also been an inspiration and I wonder if I can tease out similar transitions with mutation and selection.
There are still quite a few unknowns that I’m thinking about for this project:
1) Sound is fundamentally 1-dimensional. What are effective/intuitive ways of mapping it to 3-dimensional forms? I’m currently using three sounds per monad—one for each axis—and adjusting the pitch along that axis.
2) What’s a good balance between a controlled experience (e.g. using an instrument) and a discovery-based experience? I’m leaning more towards discovery because it fosters moments of serendipity.
3) What kind of fitness testing (if any) is appropriate for this toy? Some kind of user selection may be good, since it’s an aesthetic experience.
4) Is there a better way to handle reproduction than collision?
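To make the reproduction-through-collision idea concrete, here’s a rough sketch of what crossover and mutation for a monad’s genome might look like (C for brevity; the Monad struct, the three-gene layout, and the function names are all assumptions for illustration, not the project’s actual code):

```c
/* Hypothetical sketch: when two monads collide, a child genome is made
 * by one-point crossover of the parents' per-axis pitch genes, then a
 * small mutation is applied.  Names and genome layout are assumptions. */
#include <stdlib.h>

#define GENES 3  /* one gene per axis: x, y, z */

typedef struct { float genes[GENES]; } Monad;

/* One-point crossover at `cut`: genes before the cut come from `a`,
 * the rest from `b`. */
Monad crossover(const Monad *a, const Monad *b, int cut) {
    Monad child;
    for (int i = 0; i < GENES; i++)
        child.genes[i] = (i < cut) ? a->genes[i] : b->genes[i];
    return child;
}

/* With probability `rate`, jitter each gene by up to ±amount. */
void mutate(Monad *m, float rate, float amount) {
    for (int i = 0; i < GENES; i++) {
        if ((float)rand() / RAND_MAX < rate) {
            float r = (float)rand() / RAND_MAX;        /* 0..1 */
            m->genes[i] += (r * 2.0f - 1.0f) * amount; /* ±amount */
        }
    }
}
```

A collision handler would call `crossover` on the two colliding monads, `mutate` the child, and drop it into the field.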
So far I’ve got a simple input interface and audio playback working with OpenAL.
Here are some more sketches that have been added to the Nature of Code iOS app.
This sketch takes a page from Jer Thorp’s “Smart Rockets”. Each generation of lines is fitness-tested by how close they got to the target and how long it took them to get there. It takes just three generations for most of them to hit the target.
Obstructions can be added to their path by swiping the screen.
This flow field is generated with 3D Perlin noise. Each particle’s motion is influenced by the vector it’s positioned above. There’s also a small repulsion force so they don’t get bunched up.
I was inspired to bang out this flocking sketch after reading The Power of Swarms in Wired, and delighted to see it form a proper torus rather quickly.
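The flocking behavior boils down to three steering rules (alignment, cohesion, and separation), each built on the same neighbor loop. Here’s the alignment rule as a self-contained C sketch; the Boid struct and function names are illustrative, not the app’s actual classes:

```c
typedef struct { float x, y, vx, vy; } Boid;

/* Alignment: steer toward the average heading of neighbors within
 * `radius`.  Writes the average neighbor velocity (zero if there are
 * no neighbors).  Cohesion and separation use the same neighbor loop,
 * averaging positions and offsets instead. */
void alignment(const Boid *boids, int n, int self, float radius,
               float *out_vx, float *out_vy) {
    float sx = 0.0f, sy = 0.0f;
    int count = 0;
    for (int i = 0; i < n; i++) {
        if (i == self) continue;
        float dx = boids[i].x - boids[self].x;
        float dy = boids[i].y - boids[self].y;
        if (dx * dx + dy * dy < radius * radius) {
            sx += boids[i].vx;
            sy += boids[i].vy;
            count++;
        }
    }
    *out_vx = count ? sx / count : 0.0f;
    *out_vy = count ? sy / count : 0.0f;
}
```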
This week I’ve been exploring “The Evolution of Code” (Chapter 9). The process has been rather slow and iterative, and indeed feels like a kind of evolution itself: code, observe, tweak, repeat. I’ve made some inroads into the mechanisms of an evolving system, but I’m discovering that the most challenging part is choosing which questions to ask of the system, and how to evaluate its success.
At any rate, here are some screenshots of my process. The goal of these sketches is to evolve a set of trails that conform to the surface of an invisible sphere.
For my midterm project I explored a few different angles: face tracking, Delaunay triangulation, and hair simulation. The ultimate mashup is a sketch I’m calling “The Beard Booth”, which can be seen below.
Source for all of these sketches can be found in my GitHub repo.
Here’s a basic example of Delaunay triangulation. In this sketch I’m randomly adding 2D nodes to the triangle mesh. The colors are sampled from the center point of each triangle.
We’ve arrived at the moment of truth: particle systems. Simply put, a particle system is a collection of objects that all operate on a similar rule set. But they have a wide range of possible uses; by tweaking their variables or dressing them up, they can model behaviors like flocking birds, clouds, or fire.
This is a simple example of a particle emitter using additive blending. Dragging your finger sends a wind force toward the emitter. I’ve also included a settings panel that allows you to select any of the OpenGL blend functions.
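The per-frame particle update behind an emitter like this can be reduced to a few lines (a C sketch with assumed names; the app’s actual OpenGL/Objective-C code differs):

```c
typedef struct { float x, y, vx, vy, life; } Particle;

/* One Euler step for a particle: the wind is applied as an
 * acceleration, life ticks down, and dead particles respawn at the
 * emitter position (ex, ey).  Struct and parameter names are
 * illustrative assumptions. */
void particle_step(Particle *p, float wind_x, float wind_y,
                   float ex, float ey, float dt) {
    p->vx += wind_x * dt;
    p->vy += wind_y * dt;
    p->x += p->vx * dt;
    p->y += p->vy * dt;
    p->life -= dt;
    if (p->life <= 0.0f) {      /* respawn at the emitter */
        p->x = ex;
        p->y = ey;
        p->vx = p->vy = 0.0f;
        p->life = 1.0f;
    }
}
```

The additive blending itself is just a render-state choice; the motion model above is the same regardless of blend function.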
In this example, I’ve created a “Flame” subclass of ParticleSystem. You can spawn flames by touching the screen, and they seek out the brightest regions of the paper. They will avoid their own trails, since they’re black, and the flame burns “upwards” according to the device orientation.
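The brightest-region seeking can be sketched as sampling a few candidate points and steering toward the best one (a C sketch; the BrightnessFn callback and names are assumptions, not the app’s sampling code, and black trails returning 0 is what makes the flame avoid where it has already burned):

```c
/* Brightness lookup supplied by the caller, e.g. sampling the paper
 * texture at (x, y).  This callback type is a hypothetical stand-in. */
typedef float (*BrightnessFn)(float x, float y);

/* Returns the index of the brightest of `n` candidate points; the
 * flame steers toward that candidate on the next frame. */
int brightest_candidate(const float *xs, const float *ys, int n,
                        BrightnessFn brightness) {
    int best = 0;
    float best_b = brightness(xs[0], ys[0]);
    for (int i = 1; i < n; i++) {
        float b = brightness(xs[i], ys[i]);
        if (b > best_b) { best_b = b; best = i; }
    }
    return best;
}

/* Example brightness field for demonstration: brighter toward +x. */
float ramp_brightness(float x, float y) { (void)y; return x; }
```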
Both of these sketches have been committed to my Nature of Code GitHub repo.
This week’s installment of Nature of Code looks at oscillation and springs. As always, the code is available in my NOC GitHub repo.
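The spring behavior comes down to Hooke’s law. A minimal sketch (C for brevity; the function name and the multiplicative damping scheme are illustrative assumptions):

```c
/* One semi-implicit Euler step for a 1D spring: the restoring force is
 * F = -k * (x - rest), velocity is damped each step, then position is
 * advanced with the new velocity. */
void spring_step(float *x, float *v, float rest, float k,
                 float damping, float dt) {
    float force = -k * (*x - rest);
    *v = (*v + force * dt) * damping;  /* damping in (0, 1] */
    *x += *v * dt;
}
```

Calling this every frame with damping slightly below 1 gives the familiar decaying oscillation around the rest position.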
This week we looked at Vectors and Forces. I also added a simple example of OpenGL texture mapping.
What may not be obvious at the end of the Mover Swarm video is that I’m dragging the mouse to affect the acceleration, which results in the slower plumes.
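The force-to-motion pipeline in these movers follows the pattern from the book: accumulate forces as acceleration (a = F/m), integrate, then clear the acceleration for the next frame. A minimal C sketch (struct and function names are assumptions, not the app’s actual classes):

```c
typedef struct { float x, y; } Vec2;
typedef struct { Vec2 pos, vel, acc; float mass; } Mover;

/* Accumulate a force: a = F / m, summed until the next update. */
void apply_force(Mover *m, Vec2 f) {
    m->acc.x += f.x / m->mass;
    m->acc.y += f.y / m->mass;
}

/* Euler integration, then clear acceleration so forces don't persist
 * across frames. */
void update(Mover *m, float dt) {
    m->vel.x += m->acc.x * dt;
    m->vel.y += m->acc.y * dt;
    m->pos.x += m->vel.x * dt;
    m->pos.y += m->vel.y * dt;
    m->acc.x = m->acc.y = 0.0f;
}
```

Dragging the mouse just calls `apply_force` with a drag-derived vector each frame, which is why the plumes slow and bend.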