TweetBots is a project that involves two Braitenberg-inspired Bots acting and reacting under the influence of human emotions.
The initial attraction of developing this idea into a project was effective causality: simple causes producing unpredictable and unprogrammed behaviours.
I was very interested in these hopefully unforeseen physical manifestations, defined by what would appear to be basic intelligence. At this point I started to narrow down the basic cause-and-effect relationship of an intelligence based on reactions to certain stimuli. It was clear to me that if something appeared intelligent enough to avoid, or be attracted to, a certain environment, then it would appear to be “alive” and hence possessed of a very basic intelligence.
The stimuli I chose to isolate were SOUND, LIGHT, SMELL and TEMPERATURE, mostly because electronic sensors are readily available that can evaluate these qualities in an environment and transmit them as data.
For this particular experiment I chose LIGHT as the main and only stimulus.
This seemed like a fairly good approach, but it had limitations. The Bots appeared to be functioning on a basic level but were limited to one specific behaviour: either attraction to or repulsion from light.
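In Braitenberg’s terms this is the classic pair of vehicles 2a and 2b: two light sensors wired, directly or crossed, to two motors. A minimal Arduino sketch of that behaviour might look like the following (the pin assignments and the photocell voltage-divider wiring are my assumptions, not the exact build):

```cpp
// Minimal Braitenberg light behaviour: two photocells, two motors.
// Assumed wiring: photocells as voltage dividers on A0/A1,
// motor speed via PWM on the H-bridge enable pins (9 and 10).

const int LEFT_EYE    = A0;  // left photocell
const int RIGHT_EYE   = A1;  // right photocell
const int LEFT_MOTOR  = 9;   // PWM to left motor enable
const int RIGHT_MOTOR = 10;  // PWM to right motor enable

bool photophilic = true;     // true = seek light, false = flee it

void setup() {
  pinMode(LEFT_MOTOR, OUTPUT);
  pinMode(RIGHT_MOTOR, OUTPUT);
}

void loop() {
  // Map each sensor reading (0-1023) to a motor speed (0-255).
  int left  = map(analogRead(LEFT_EYE),  0, 1023, 0, 255);
  int right = map(analogRead(RIGHT_EYE), 0, 1023, 0, 255);

  if (photophilic) {
    // Crossed connections: more light on the left speeds up the
    // right motor, steering the Bot toward the light (vehicle 2b).
    analogWrite(LEFT_MOTOR,  right);
    analogWrite(RIGHT_MOTOR, left);
  } else {
    // Direct connections: more light on the left speeds up the
    // left motor, steering the Bot away from the light (vehicle 2a).
    analogWrite(LEFT_MOTOR,  left);
    analogWrite(RIGHT_MOTOR, right);
  }
}
```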
I started wondering how this behaviour could be initiated by human influence, and whether it could even be done remotely. Could these Bots act based on remote input?
However, the more I thought about this, the more I realized that there was an inherent contradiction in the idea. If I wished for the Bots to have a certain essence of autonomy (after all, they were “alive”), was it possible at all to bring in human influence in a way that doesn’t involve a direct input? I wanted the Bots to sense their environment, but I was also interested in a more advanced way of controlling them: a control that is not only remote but that doesn’t involve a human acting directly on the stimulus itself, LIGHT.
How could I have human input that is not just someone holding or withholding a light source to disrupt and influence the Bots?
I chose Twitter as the source of this influence. The reasoning was that tweets not only contain inane, pointless statements but can also be a great platform for simple emotional expression. I chose to focus on these emotional expressions. How could I convert a tweet, for example “I am so happy, I just got accepted to ITP”, into a command that the Bots can execute in a physical space?
I was concerned about creating a database of “happy” words and “sad” words, mostly because of the conflict of taking them out of context. I have to admit, there is something conceptually very interesting about losing the meaning of a tweet in translation and misinterpreting it, but for this specific project I wanted to stay as close to the original intention as possible.
The solution was fairly simple: look for the words “sad” and “happy”. More often than not, these words illustrate their true meaning in a sentence, which minimizes the chance of misinterpretation. I wanted to avoid making Schadenfreude Bots.
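Since the project just matches two words, the check itself is tiny. The actual matching lived in the Processing sketch; here is the same logic as a small, self-contained C++ sketch (the function name and the lowercasing step are mine):

```cpp
#include <algorithm>
#include <cctype>
#include <iostream>
#include <string>

// Classify a tweet as "happy", "sad", or neither by a simple
// case-insensitive substring search. No word database, no context.
std::string classify(std::string tweet) {
    std::transform(tweet.begin(), tweet.end(), tweet.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
    if (tweet.find("happy") != std::string::npos) return "happy";
    if (tweet.find("sad")   != std::string::npos) return "sad";
    return "none";
}

int main() {
    std::cout << classify("I am so happy, I just got accepted to ITP") << "\n";  // happy
    std::cout << classify("Such a sad, rainy day") << "\n";                      // sad
}
```

A substring search like this will also match “sad” inside longer words such as “sadly”, which in practice usually carries the same sentiment, so it is a fair trade for the simplicity.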
So the basic scheme of this experiment becomes this: the origin of the tweets being parsed is set to the longitude and latitude of ITP, with anything within a 10-mile radius being parsed for “active” words.
If “sad” is in a tweet, it gets extracted and sent to the Bots, which then switch their behaviour to “photophobic”; if “happy” is present, the Bots become “photophilic”.
Each Bot contains an RGB LED that changes color based on the word received, acting as a status indicator and also functioning as a stimulus (a light source) that provokes a response in the Bots to their ever-changing environment. Ultimately the Bots roam around in a physical space, changing their physical proximity to each other based on the virtual emotional space around them (Twitter). The resulting behaviour is fascinating.
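On the Bot side, the whole Twitter pipeline reduces to a byte arriving over the XBee’s serial line (the XBee looks like a plain serial port to the Arduino). Here is a sketch of the receiving end, under the assumption that Processing sends a single character, ‘H’ for happy and ‘S’ for sad, and that the LED sits on three PWM pins; both the one-character protocol and the pin choices are my assumptions:

```cpp
// Receive a mood over the XBee serial link and update both the
// behaviour flag and the RGB status LED.
// Assumed protocol: 'H' means a "happy" tweet, 'S' means a "sad" one.

const int RED_PIN   = 3;  // RGB LED legs on PWM pins (assumed)
const int GREEN_PIN = 5;
const int BLUE_PIN  = 6;

bool photophilic = true;  // the same flag the motor loop reads

void setColor(int r, int g, int b) {
  analogWrite(RED_PIN, r);
  analogWrite(GREEN_PIN, g);
  analogWrite(BLUE_PIN, b);
}

void setup() {
  Serial.begin(9600);     // XBee default baud rate
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char mood = Serial.read();
    if (mood == 'H') {         // "happy": seek light
      photophilic = true;
      setColor(0, 255, 0);     // status color is also a stimulus for the other Bot
    } else if (mood == 'S') {  // "sad": flee light
      photophilic = false;
      setColor(0, 0, 255);
    }
  }
  // ...the Braitenberg motor loop from the earlier sketch runs here...
}
```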
*Parts were mostly sourced from Pololu, SparkFun and Jameco.
- Tamiya 70097 Twin-Motor Gearbox Kit, at a 204:1 gear ratio
- Round robot chassis from Pololu (amazing for balance)
- Tamiya toy tires
- SN754410 Quad Half H-Bridge (for controlling each motor individually; see the wiring sketch after this list)
- Arduino Pro 328 – 5V/16MHz
- 2.4GHz XBee
- RGB LED
- round acrylic half globe (Canal Plastics)
- various wires and breadboard
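For reference, each motor on the SN754410 takes three Arduino pins: two direction inputs and one PWM-driven enable. A minimal helper for one motor, with assumed pin numbers:

```cpp
// Drive one gearmotor through one half of the SN754410.
// Assumed wiring: inputs 1A and 2A on pins 7 and 8, and the
// shared enable 1,2EN on PWM pin 9; the motor sits across 1Y/2Y.

const int IN_A   = 7;
const int IN_B   = 8;
const int ENABLE = 9;

void setup() {
  pinMode(IN_A, OUTPUT);
  pinMode(IN_B, OUTPUT);
  pinMode(ENABLE, OUTPUT);
}

// speed in -255..255: sign sets direction, magnitude sets PWM duty.
void drive(int speed) {
  digitalWrite(IN_A, speed >= 0 ? HIGH : LOW);
  digitalWrite(IN_B, speed >= 0 ? LOW : HIGH);
  analogWrite(ENABLE, abs(speed));
}

void loop() {
  drive(200);   // forward
  delay(1000);
  drive(-200);  // reverse
  delay(1000);
}
```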
- PHP, using Twitter API PHP code (TwitterSearch), to query for tweets based on “ID” and geographic location
- Processing to parse tweets and detect “happy” and “sad”
- Arduino to receive “happy” and “sad” messages and control the Bots’ behaviour
Code and other bits: thanks to Eric Mika, Daniel Shiffman, Heather Dewey-Hagborg, Rory Nugent and Tom Igoe.