# Project Development – Logic

Logic is a puzzle game with the aim of educating about the basic building blocks of digital computation: Logic Gates. By playing this game, the player learns about different logic gates (currently: AND, OR, NOT), their different characteristics, and how to use them.

The game challenges the player with different levels. Each level contains an input binary signal and a desired output. To solve a level, the player needs to connect different logic gates in the circuit to get the desired result. To reduce complexity, a color is assigned to each bit of the binary signal, so the player can track the bits one by one by their color.
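As a plain illustration (not the game's actual code), the three gates currently in the game reduce to simple boolean operations, and a level is solved when a chain of them turns the input bits into the desired output:

```java
// Illustrative only: the game's three gates as boolean functions.
public class Gates {
    static boolean and(boolean a, boolean b) { return a && b; }
    static boolean or(boolean a, boolean b)  { return a || b; }
    static boolean not(boolean a)            { return !a; }

    public static void main(String[] args) {
        // Example circuit: NOT(a AND b), fed with two input bits.
        boolean a = true, b = false;
        System.out.println(not(and(a, b))); // true
    }
}
```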

The game was developed as an iPad app using OpenFrameworks.

Logic on the iPad – Level 6

Final Presentation critique:

• People loved the aesthetics of the game.
• The touch pads that connect the gates were a bit annoying.
• We need to make sure the first levels are easy enough for beginners.
• We should make a better tutorial for the game.
• Provide a more appealing end result: something amazing should happen at the end, when the electricity reaches the desired result.

Next Steps:

We want to continue developing the game and publish it in the App Store. To make that happen, we have some work to do:

• Make easy levels. The introductory levels need to be much easier. One way to do this is to give the player a level that already has some components in place; the player then only needs to change or add one component to make the circuit work.
• Create about a hundred levels.
• Re-work the graphics. There is a lot of room for improvement here. None of us is a designer, so we need to collaborate with someone.
• Create in-game step-by-step tutorials on general gameplay directions and on introducing a new gate to the player.
• Re-work the end result. It needs to be more appealing. One suggestion we discussed was a safe-breaker.
• Probably many other things that I cannot foresee.

# Nature of Code Final project – final

About two months ago I stumbled upon this generative art piece made by Georg Nees around 1965. In this piece, Nees moves a plotter head randomly with the constraint that only one axis can move at a time, creating closed, squarish shapes. I immediately knew that I wanted to use that aesthetic in my own work, so I decided that these shapes would be creatures in my invented world. But what do these creatures want? Why do they exist? These questions remained unanswered.

Then I saw a very cool video by Dennis P Paul, who sonifies everyday objects. For a long time I had wanted to play with the idea of mapping music to shape, so I decided to connect the two parts: Georg Nees's shapes would want to create music, and every shape would be a manifestation of a musical attribute.
This is the first project I made in this subject. I call it GOGOAM because that's how my two-year-old kid says "candy".

GOGOAM generates music by mapping geometrical properties of shapes into notes. At the heart of the system are shapes. The shapes encode pitch and volume over time, and by that embody melody riffs or loops.
Shapes cannot create sound on their own; they need a playback device. A playback device defines an instrument and a rhythm. The combination of a shape and a playback device (instrument) creates the melody loop.
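The post does not spell out the exact encoding, so here is one hypothetical sketch of such a mapping (all names invented): each horizontal segment of an axis-aligned shape becomes one note, its height giving the pitch and its length the duration.

```java
// Hypothetical mapping sketch: horizontal segments of a closed
// orthogonal shape -> (pitch, duration) note pairs.
public class ShapeMelody {
    // xs/ys are the vertices of an axis-aligned polyline.
    static int[][] toNotes(int[] xs, int[] ys) {
        java.util.List<int[]> notes = new java.util.ArrayList<>();
        for (int i = 1; i < xs.length; i++) {
            if (ys[i] == ys[i - 1]) {                 // horizontal segment
                int pitch = 60 + ys[i] / 10;          // height -> pitch near middle C
                int dur = Math.abs(xs[i] - xs[i - 1]); // length -> duration in ticks
                notes.add(new int[]{pitch, dur});
            }
        }
        return notes.toArray(new int[0][]);
    }

    public static void main(String[] args) {
        int[] xs = {0, 40, 40, 90};
        int[] ys = {0, 0, 20, 20};
        for (int[] n : toNotes(xs, ys))
            System.out.println("pitch " + n[0] + ", dur " + n[1]);
    }
}
```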

The user can control the mood of the music by dragging shapes on and off the instruments. At any time, if he feels he got crappy shapes that don’t sound good at all, he can use the recycle portal to generate a new random shape (and melody).

GOGOAM from Gal Sasson on Vimeo.

Technical Info:
GOGOAM is a Processing sketch that sends MIDI notes to Ableton Live using the MIDIBus library.
All the music is generated by the sketch, except for the beat, which comes from a continuously looping track in Live. That raised a question: how do I sync the notes generated by my Processing sketch to Ableton Live's beat? I wasn't able to find a complete tutorial on the subject, so I decided to post my technique below.

I needed my MIDI notes to be synced to Live's beat. The MIDI standard defines a special mechanism for this, called "MIDI timing clock": a message that is sent by a MIDI device 24 times per beat. So if your song is 120 bpm, this message will appear 120 × 24 = 2,880 times per minute.

We will set up Live to output these messages to the Processing sketch, and have the sketch listen and react to those messages.

1. In Live, go to Preferences. Under MIDI Sync you should see your MIDI output device under MIDI Ports. Toggle Sync "On". That's it. Now Live will send MIDI timing clock messages over that device.
2. In your sketch, assuming that you already use the MIDIBus library, paste this function:

```java
void rawMidi(byte[] data)
{
  if (data[0] == (byte)0xfa)
  {
    println("START");
  }
  else if (data[0] == (byte)0xf8)
  {
    println("CLOCK");
  }
}
```
• This function will be called by the MIDIBus library whenever a MIDI message is received.
• The START message will appear once when you press Play in Live.
• CLOCK messages will appear 24 times per beat until you hit Stop in Live.
• Did you notice the similarity to the Processing API? (START = setup, CLOCK = draw.) Whenever a START message is received, do the setup stuff: reset your counters and states, etc. When you receive a CLOCK message, it is time to do stuff (play notes).
• For example: if you want to play notes only on the beat, set up a counter that increments on every CLOCK message you receive. When the counter reaches 24 CLOCKs, send a MIDI note and reset it.
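The counter from the last bullet can be sketched like this (plain Java mirroring the MIDIBus `rawMidi` callback; the note-sending call is whatever you already use, here just counted):

```java
// Sketch of the on-beat counter: count 24 CLOCK messages, then fire.
public class BeatCounter {
    static final int PPQN = 24; // MIDI timing clocks per beat
    int clocks = 0;
    int beats = 0;

    // Mirrors the rawMidi(byte[]) callback from the MIDIBus library.
    void rawMidi(byte[] data) {
        if (data[0] == (byte) 0xfa) {        // START: like setup(), reset state
            clocks = 0;
            beats = 0;
        } else if (data[0] == (byte) 0xf8) { // CLOCK: like draw(), advance
            clocks++;
            if (clocks == PPQN) {            // one full beat elapsed
                clocks = 0;
                beats++;                     // here: send your MIDI note
            }
        }
    }

    public static void main(String[] args) {
        BeatCounter bc = new BeatCounter();
        bc.rawMidi(new byte[]{(byte) 0xfa});      // START from Live
        for (int i = 0; i < 48; i++)              // two beats worth of clocks
            bc.rawMidi(new byte[]{(byte) 0xf8});
        System.out.println(bc.beats);             // 2
    }
}
```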

All of the code, including the Ableton Live set can be found here: https://github.com/galsasson/nature-of-code/tree/master/genetics

# Proposing a new mode for Processing IDE

Imagine this: you wrote a sketch that draws some circles with random colors and animates them using a physical force simulation. Your sketch is working perfectly and you are happy. But you think that reducing the size of the circles might give you a better effect, or that changing a friction coefficient will look better. You have a lot of numerical constants in your sketch, right? The size of the circles, the range of colors to randomize, the maximum speed, and so on.

You stop the sketch, change the numbers, and run again. You do this over and over until you get the desired result. But is this the best result, or just the best one you have tested?

Many times I find myself in exactly that situation. Sometimes I turn some of the constants into variables and tie them to the mouse position, sometimes to key presses, and very rarely I use a GUI library to create sliders. I find the existing ways of dealing with this problem insufficient, and I would like to develop a better solution.

My proposal is to develop a mode in the Processing IDE that will automate the creation of real-time controllers over specific numerical constants in a sketch. The controllers will be presented to the user when he runs the sketch and will enable him to change numerical values in real time while the sketch is running.

I’m proposing to implement the process of automating the creation of these control points inside the programmer’s sketch using the following method:

1. The user will define (highlight/select) specific numbers inside the sketch to be automated.

2. When the user executes the sketch, the proposed software will copy it into a temporary directory and modify it there so that all the previously selected numbers are replaced by variables, and modification methods are added to the class. The Processing IDE will then execute the modified sketch.

3. When the user sees the sketch output window, another window (which can be the IDE window) will present the code with the controllers seamlessly embedded inside it.

For example, this code:

Will become this:

As you can see, a controller box has been assigned to every number in the sketch that the user chose. Each controller box in the above image contains UP and DOWN buttons to modify the value of the number. These UP and DOWN buttons are shown here for simplicity; the same effect, or an even better one, can be achieved with a slider for each number.

4. When the user changes the value of a number in the code window, an event will be sent to the sketch program, resulting in a call to one of the automatically added methods, which in turn updates the variable's value.
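The rewrite in step 2 and the update in step 4 might look something like this (a sketch only; the field and method names are invented for illustration):

```java
// Before the rewrite, the user highlighted the literal 30 in:
//   ellipse(x, y, 30, 30);
// After the rewrite in the temp copy, the literal becomes a field
// plus an auto-added modification method the IDE can invoke.
public class RewrittenSketch {
    float circleSize = 30;                // was the literal 30

    void setCircleSize(float v) {         // auto-added modification method
        circleSize = v;
    }

    public static void main(String[] args) {
        RewrittenSketch s = new RewrittenSketch();
        s.setCircleSize(18);              // the UP/DOWN buttons end up here
        System.out.println(s.circleSize); // 18.0
    }
}
```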

If you are an artist, interactive designer, game designer, teacher, or anyone who is doing something creative with Processing, you most probably recognize the need for a feature like this. Being able to change numerical constants on the fly will free the programmer from the tedious job of refining numbers and let him focus on program flow and structure.

# NOC Final Project

My project will hopefully be an evolving musical instrument. It will be evolving for sure, and I hope to make it a musical instrument. I'm exploring the relationship between music and shape, and want to add genetic algorithms to all of it to create an evolving system. Simple generative shapes will be the center of interaction. Every shape will be a creature that can mate and evolve over time. In addition, each creature has its own melody, which is tightly coupled with its shape. The user can grab creatures, mate them, and place them on a unique gramophone to play their tune.

This is a first sketch of my idea. I still cannot call the sound produced by this sketch music, but I hope it will get there: https://vimeo.com/64205003

I was inspired by this generative artwork done by Georg Nees, that was first presented in 1965 (thanks to Matthew Epler).

In this piece, Nees moved a plotter head one axis at a time, creating closed, squarish shapes. Every shape also has one diagonal line, most probably the one that connects the end point back to the start. The rest of the vertices are completely random.
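One way to generate such a shape (my reading of the piece, not Nees's actual program): take random axis-aligned steps, then close the loop with a final segment back to the start, which is the one edge that may be diagonal.

```java
import java.util.Random;

// Sketch of a closed "Nees-style" polyline: every step moves along
// exactly one axis, and the last edge closes the loop to the start.
public class NeesShape {
    static int[][] generate(int steps, Random rng) {
        int[][] pts = new int[steps + 1][2];
        for (int i = 1; i < steps; i++) {
            pts[i] = pts[i - 1].clone();
            int axis = rng.nextInt(2);             // move x OR y, never both
            pts[i][axis] += rng.nextInt(41) - 20;  // random step length
        }
        pts[steps] = pts[0].clone();               // close back to the start
        return pts;
    }

    public static void main(String[] args) {
        int[][] shape = generate(8, new Random(1));
        System.out.println("points: " + shape.length); // 9, start repeated at the end
    }
}
```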

The technique for making sound out of the shapes is basically a copy of what Dennis P Paul did for physical objects: https://vimeo.com/49484255

I hope to develop a technique to embed a short melody into a 2D shape in such a way that the shape resembles the melody.

# Project Development Studio – Midterm

Our project idea started as an installation art piece that shows how the computer (or the arithmetic/logic unit) works, down to the electricity level. This was our initial idea, but we felt that we were missing something. There are existing projects and devices that simplify the adder circuit, and we wanted to do something more unique. We started thinking about different systems that can be seen as electronic circuits: systems in which the occurring events feed the next events and eventually become a result. We started to think about stories and how the characters in those stories contribute to the final narrative.

This project is about showing the basic concepts of computational logic: there is input, processing, and output. We want to create a fictional story where each character represents one bit in the input of a logic circuit. The user will switch the characters on and off and observe the resulting narrative, which is affected by his choices. This project, now, is more about stories.

One piece of feedback we got in class while presenting the idea was to think carefully about how the user interacts with the device. We need to make sure it is clear to the user that he should toggle the switches and then just watch. One solution would be to design the system so the user cannot physically interact with the piece while it's playing a story. Another would be to drop this requirement and allow the user to change the story while it is playing (somehow).

Another thing that was mentioned is that we should make sure it is clear that it is electricity that we are eventually talking about, and not to drop this aspect. We need to think carefully how we design the story landscape and events in a way that the observer will see the relationship between the piece and electronic circuits.

Tasks for the rest of the semester:

• Complete the story while thinking about electricity and the adder circuit.
• Design the interaction between the user and the piece.
• Shoot / Animate different events throughout the story.
• Build the physical piece.
• Program the software that will run it.

The final presentation will be a board hung on a wall. The board will present the story landscape and will feature different moving parts that change position according to the user's inputs. A projection onto the physical landscape will present the story as it plays, including different animations and sounds. I can envision this piece in a museum or some other kind of public space, at roughly 2m x 2m.

# Project Development – Week 1

The idea of the project is to show people how computers work from the conceptual macro level to the physical micro electronic level. This project will hopefully enable people to dive into this black box and see what’s inside.

The complexity of the computer is made possible by encapsulation. Small, simple parts are put together into more complex parts; these, again, form a more complex system, and so on. The goal of this project is to enable the user to traverse these levels of complexity.

This project can take various forms such as:

• Computer Graphics on screen / projection
• Physical model
• Web interface

Scale: This visualization should be presented in a big scale, preferably on a room wall or floor.

Affects: playful, thrilled, confused.

Verbs: educate.

I have a great appreciation for systems that rely on simple rules that create large complex systems. I want to pass that feeling to other people.

Inspiration Projects:

Lego Turing Machine by Jeroen van den Bos and Davy Landman.

https://vimeo.com/44202270

The Turing machine was invented in 1936 by Alan Turing, although he never built one himself. This machine is the father of all modern computers and can, in principle, execute any program.

Form: Sculpture / Functional Model
Material: Lego bricks
Scale: 1m x 0.3m
Affects: curious
Verbs: educate, motivate
Motivation: In honor of Alan Turing.
Ideal Audience: Technology enthusiasts, students.
Location: Gallery.

# Karl Sims

Karl Sims describes himself as a digital media artist, computer graphics research scientist, and software entrepreneur. Most of his work consists of using computation and genetic algorithms to create interactive computer graphics and evolving systems.

In 1994, during his studies at MIT, he created a computer program that simulates Darwinian evolution of biomechanics. He calls this work "Evolved Virtual Creatures". He randomly created a population of different imaginary creatures built from blocks, with some sort of mechanics embodied in them, so that they move in some way over time. Each creature has a "genetic code" that encapsulates both its shape and its mechanics. Each population of creatures was given a different task, such as swimming in water or fighting over a cube. At the end of each iteration he used a genetic algorithm to take the fittest individuals out of the population, based on how well they performed the task, and mate them (combining their genetic code and adding mutation) to create the next generation. After many iterations he got some really cool creatures, some of which resemble the behaviors of natural creatures.
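The loop described above follows the usual genetic-algorithm shape; here is a minimal sketch of it (the genome and fitness function are trivial stand-ins, nothing like Sims's block creatures):

```java
import java.util.Arrays;
import java.util.Random;

// Minimal genetic-algorithm loop: evaluate fitness, keep the fittest,
// mate them with crossover and mutation to form the next generation.
public class TinyGA {
    static Random rng = new Random(42);

    // Stand-in fitness: the number of 1-bits in the genome.
    static int fitness(int[] g) {
        int s = 0;
        for (int x : g) s += x;
        return s;
    }

    // Crossover plus occasional mutation.
    static int[] mate(int[] a, int[] b) {
        int[] child = new int[a.length];
        for (int i = 0; i < a.length; i++) {
            child[i] = rng.nextBoolean() ? a[i] : b[i]; // pick a parent's gene
            if (rng.nextInt(10) == 0) child[i] ^= 1;    // rare bit-flip mutation
        }
        return child;
    }

    // Runs the evolution and returns the best fitness found.
    static int evolve(int pop, int len, int gens) {
        int[][] population = new int[pop][len];
        for (int[] g : population)
            for (int i = 0; i < len; i++) g[i] = rng.nextInt(2);

        for (int gen = 0; gen < gens; gen++) {
            Arrays.sort(population, (a, b) -> fitness(b) - fitness(a));
            // keep the fittest half, refill the rest by mating survivors
            for (int i = pop / 2; i < pop; i++)
                population[i] = mate(population[rng.nextInt(pop / 2)],
                                     population[rng.nextInt(pop / 2)]);
        }
        Arrays.sort(population, (a, b) -> fitness(b) - fitness(a));
        return fitness(population[0]);
    }

    public static void main(String[] args) {
        // Converges toward the maximum possible fitness of 16.
        System.out.println(evolve(20, 16, 50));
    }
}
```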

Paper of this work:

Evolved Virtual Creatures
K.Sims, Computer Graphics (Siggraph ’94 Proceedings), July 1994, pp.15-22.

# Re: Design For Wearability

This research paper describes useful design guidelines for developing wearable devices. The guidelines are very intuitive and include Placement, Form Language, Human Movement, Proxemics, Sizing, Attachment, and a few more. The paper describes each guideline in moderate detail. Finally, the authors present their best-practice wearable forms and explain the method they used to test them.

The study was conducted with ten people performing various motions with and without the wearable forms. At the end, they were asked to describe the comfort level and freedom of motion they felt.

I think this paper is useful. It gives clear guidelines and also suggests what seem to be fairly comfortable wearable forms for various places on the body. I think this study could be improved in two ways. First, there was no follow-up after the results were gathered; maybe these forms can be improved further? Second, more scientific feedback and monitoring could give more meaningful results. One approach I can think of is to capture the subjects' motion while performing the movements with and without the equipment, and then compare the results computationally to measure the variance between the two situations.

# Visual Language – Logo

Orange logo:

For the ITP logo I built on the concept of interaction (listen-think-talk, or input-process-output) and came up with this design. Maybe it's too complicated.

Here it is: