Category Archives: Embodiment


Everything we know and understand is connected to our senses. Our perception and the things we know can be seen as data collected from our interactions with the world around us. We know that stoves are hot because we touched one as a kid. Or maybe you know it’s hot because you remember your mother’s voice telling you that it was. Perhaps you realized it was hot because of your previous association of the color red with things that are hot. The things that we have learned through the use of our senses form a collection of memories that we subconsciously draw on when we make decisions.

As we live we interact with our environments as well as other living creatures. Another response to these interactions is our emotions. Emotions are much more complex than the raw information you may get from just using your senses. For example, if you touch a stove you will feel the heat and that information will be stored in your brain; however, there is a complementary reaction that takes place: an emotion. As well as now knowing that the stove is hot, your brain may also respond to this experience by getting sad or upset. So now instead of simply knowing the fact that “stoves are hot,” you have a memory that probably sounds more like “‘stoves are hot,’ and this I know based on a sad memory of when I burned myself.” We separate our experiences and interactions based on the emotions that they produced within us.

I believe that computers have already started to reach our bodies in different ways. Virtual reality and augmented reality use our senses of sight and hearing to produce an experience that is more immersive than others. There are computers with touch accommodations for those who may be visually impaired or blind. There are countless other ways in which computers use our senses to make for a better experience; however, emotions are a much more complicated equation. While humans may be able to program computers that play on common human emotional responses, there will always be outliers due to the fact that we all have different factors that influence our moods. So the question becomes: how do we make computers that not only understand human emotions but also use them to make an individualized and unique experience? In my opinion, though, trying to create a system that truly understands emotion is a mistake. I feel that the ability to understand and relate to emotions is something unique to us as humans, and before we put our efforts toward giving that part of us to computers, we need to focus on improving human-to-human interactions first.


Our body is an important channel that we use to take in all sorts of information. The amount of information taken in is huge, but our bodies and minds do not process it all. Whatever we deem important has to go through a process to be stored in short-term and long-term memory. The process includes many steps, but wondrously happens instantaneously. Our bodies allow us to take in everything around us.

Emotions can drive many parts of our psyche. They are a huge part of our cognition, and they are definitely the fuel for many of our decisions. Emotions can help us think a certain way about things. When our minds go to process information and memories, the intake is influenced by a number of factors, such as emotions. Emotions can affect how you remember something, and that in itself can teach us things about ourselves, shape decisions regarding present experiences, and either cloud or clear our thinking about something.

Computers can reach your body quite easily because they are tools. Our bodies can use them, so computers need to be usable to reach our bodies. For our emotions, it might be a bit more complicated, because computers have to be used in a meaningful way to affect our emotions. “Meaningful” could mean so many different things as well, because people’s experiences are so different, and so are the emotions that come with them.

Reading Response to Embodiment

In the reading “You Are Not Your Brain,” the writer states that the human body plays a vital role in controlling the way humans think and behave. His argument is based on several studies done by professional researchers around the world. First, the author points out that the language people use in their daily lives always contains a bunch of embodied metaphors related to the physical world that people’s bodies experience. To be more specific, the words people use to express their feelings and thoughts are often metaphors. For example, when people use “up” and “down” to describe their emotions, they are indicating that their body temperature is going up or down. The physical reactions people have to changes in their surrounding environment influence their cognition. Then, in the other reading, “Siri Is Not ‘Genderless’,” the writer argues that gender is able to influence people’s experience and cognition when applied to technology. In other words, the female voice of Siri, a genderless machine, enables technology to reach human cognition and emotion. This indicates that a feminized voice is one successful way to help technology reach human emotion and the body. What’s more important is that, building on the earlier reading about embodied metaphors, designers today could create more methods like the female voice to reach human cognition. As far as I am concerned, in addition to vocal experience, temperature, smell, and other physical reactions to the exterior world could be utilized for computers to reach the human body and even emotion.

Our understanding of the world is based on our “sensors”: eyes for light, nose for smell, ears for sound, mouth for taste, skin for temperature and pressure. These sensors gather information and transfer it to our brain through the nervous system. The brain processes this information, and from it we learn what the world looks like.

Emotions, in some cases, are part of the default settings of our body. For example, anger is activated if something very bad happens and the source of the bad thing is very obvious, and the reaction to anger may be violent actions. Of course, emotion does not directly control your consciousness, but this system produces hormones that affect it. We are thus affected by emotion without even realizing it.

Computers can reach our body easily: graphics for the eye, sound for the ear, vibration for the skin. The nose and mouth can be reached as well with some more expensive devices. But emotion is another story. If a computer wants to reach one’s emotions, it has to calculate different factors about the person to estimate what their emotion is at that moment. That would require a substantial foundation of biological research.

Computers and Emotions

Our body is important to our understanding of the world because our body is our tool of interactivity. Our body is not just impacted by our mind; our mind is also changed by our body. The mind is a concept and doesn’t physically come into contact with the world the way our bodies do. So, our bodies are important because they take inputs from the outside world while our mind makes sense of them.

We use our emotions to think because we are not like computers. Humans are not able to look at something completely objectively. Whether we like it or not, our past experiences will always cloud our judgment, so our emotions will always impact the way we think. For example, if you touch a hot stove, you feel distress and discomfort. These emotions impact the way you think, because next time you’ll know better than to touch the stove. This is a less extreme example, but the same thing happens on a much bigger scale in the brain. Trauma deeply impacts the human brain and can stop people from doing many things. The emotions attached to memories of actions, or of anything really, can impact thinking so much that people can develop mental illnesses like PTSD.

Computers can reach our body through many mediums. The most obvious is the touch input that people give computers to complete commands, like typing. However, computers have greatly advanced over the years. Computers can be used for motorized prosthetics and other aids for people with physical disabilities. Computers touch your emotions because they give you access to everything. People can talk to each other through computers, and interactions with other people always involve emotions. Computers can also reach emotions like awe or anger at the speed of the computer or the trickiness of its inputs. Interactions with computers often involve emotions, so technically computers are reaching your emotions.

The Body

A few weeks ago we discussed how our hands contribute to our understanding of the world around us in a previous reading. This week we are expanding on this discussion and evaluating our entire body as an information synthesizer. While this week’s reading focused on examples like embodied cognition, there are many other examples in which our bodies directly provide us knowledge of our surrounding environment. Our bodies are constantly taking in information to create a spatial map of our surroundings. In order to read my writing, you need to look at a screen (a computer or smartphone); yet, as you look upon this screen, you can be confident that the space behind you has not changed.

How can you be confident about that? Your eyes are fixated on the screen and you literally cannot see what is behind you. Our bodies are constantly collecting sensory information that our minds evaluate to create a map of our surroundings. Since the sensory information your body is collecting has not dramatically changed, your mind can assume that your surroundings, even the things behind you, have not either.

By collecting sensory information, our bodies allow us to navigate our world more efficiently. We do not need to stop before crossing the street to determine if the sidewalk is stable. Instead, if the motion with which our feet hit the ground does not align with our mental definition of a “stable” surface (for example, it wobbles or moves), our feet will inform our brains that the road may not be safe. As I write about the relationship of the mind and body, specifically how the body influences the mind, it becomes more and more evident that designers must take into account how our bodies gather information when creating powerful interaction designs. For the past two years I have studied cognitive science and simply assumed that what I know about memory is all I could apply to my UX designs. While it is important to recognize the limitations of a user’s memory, a designer must also be aware that reading and listening are not the only ways a user gathers information about the product.

Designers must also focus on the influence of emotion. Perhaps the most fascinating piece of information I have learned in my background in cognitive science is that memories are dependent on emotions. Unfortunately, your memory of an experience is never accurate. Instead, every time you remember an experience, you are actually remembering how you recalled the experience last time. Thus, whatever emotions you are feeling about the experience wash over the memory, greatly altering it. For example, childbirth is often described as the most painful experience a woman will endure in her lifetime. However, during childbirth and continuously afterward, the new mother’s brain is flooded with dopamine. Why would the brain do that? It is because, as the mother starts recalling this extremely painful experience, the extra dopamine ensures that she remembers it less negatively. As a result, the new mother no longer remembers childbirth as extremely painful, or even traumatic, and may now be willing to have another child.

By taking emotions into account, designers can be more aware of, and can better shape, the experiences their users are having with their design. For instance, if a designer wants the user to continue to use a product, they can add things that spike a user’s dopamine level (like a pleasant sound or a funny meme) during the experience. They can tailor the notifications a user receives about a product (such as an app) to resonate with positive emotions, further ensuring that the user remembers the product positively.

One of the most valuable things I learned while taking a course on design thinking is that designers need to truly listen to users and at times even read between the lines, because users may not be able to vocalize what they are experiencing or need. It would be a shock if a user could vocalize what their body or mind was experiencing as they used a product; thus, by having knowledge of the mind and body, designers can truly understand their users.


We are born with the ability to observe the world. Through our eyes, we see; through our nose, we smell; through our tongue, we taste; through our ears, we listen; through our skin, we feel. Our body is the most well-structured apparatus in this world. It carefully measures all aspects of our surroundings and lets our brain organize the messages it receives. While processing these messages, we reflect on them and learn to make decisions. Our body is the very first and most essential step for us to understand the world. Without using our body to feel the world, our thinking process would be negatively affected or stay stagnant. Only when our body has primary physical experience can we understand the world better, using metaphors as a bridge. As mentioned in the article “A Brief Guide to Embodied Cognition: Why You Are Not Your Brain,” people tend to use metaphors to comprehend abstract concepts, and “metaphors are more than mere language and literary devices, they are conceptual in nature and represented physically in the brain.”

I don’t consider thinking with only emotions to be real thinking, because emotions can drastically affect our judgment. Rational thinking tells us the possible results we may face when we are deciding whether or not to do something. On the contrary, emotional thinking leads us to do whatever makes us feel good, even if the results of such behavior can sometimes be quite unpleasant. It’s good if our emotions are positive, but no one can promise that he/she will always be happy for the rest of his/her life. Negative emotions like anger, jealousy, anxiety, and depression will corrupt us if we think only emotionally and not rationally.

Computers can reach our emotions easily because we post a lot about ourselves. I think attaching sensors (such as an Apple Watch) to our bodies enables computers to reach both our body and our emotions. The sensors can detect our heart rate, vital capacity, pulse rate, etc.; then computers can make calculations against their vast databases to figure out our emotions, based on the fact that our body reacts differently when we have different emotions.
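To make this concrete, here is a toy sketch (in Python) of what that kind of calculation might look like. The signal names, thresholds, and emotion labels are all hypothetical illustrations, not values from any real physiological study:

```python
# Toy sketch: infer a coarse emotional state from two body signals.
# The thresholds and labels below are made-up assumptions for illustration.

def estimate_emotion(heart_rate_bpm, skin_temp_c):
    """Guess a coarse emotional state from two body measurements."""
    if heart_rate_bpm > 100 and skin_temp_c > 37.0:
        return "agitated"   # elevated heart rate plus warm skin
    if heart_rate_bpm > 100:
        return "excited"    # elevated heart rate alone
    if heart_rate_bpm < 60:
        return "calm"       # resting heart rate
    return "neutral"

print(estimate_emotion(110, 37.5))  # agitated
print(estimate_emotion(55, 36.5))   # calm
```

A real system would of course replace these hand-picked thresholds with a model trained against a large database of labeled physiological data, as the paragraph above suggests.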

Body and emotion

Bodies are like our sensors to the world. How these sensors communicate with the world largely decides how we think. The process behind it is no different from how robots work: detecting, transmitting information, processing, transmitting again, and then reacting. The work of our brains is only the processing part, one fifth of the whole. It is our bodies that allow us to interact with the world. Besides, how we define things is also based on how we feel with our bodies. However, that does not mean our bodies are more important than our brains. Emotions happen in our brains and heavily influence the way we think. We feel negative when we prefer to refuse or deny things, and positive when we do the opposite. Since more voice assistants have come out, these weak AIs have started to interact with us humans in a more human-like way, the way they were designed to. We cannot treat them like cold machines anymore. As pointed out in the discussion about the gender of Siri, these assistants can affect our feelings with their voices and responses. The history of this effect is longer than that, but the effect became more apparent as our computers got more humanized.

Embodied Cognition

Before reading “A Brief Guide to Embodied Cognition: Why You Are Not Your Brain,” I believed all reason and thought was based within the mind. I’ve never spent too much time thinking about why I made the decisions I did or said things the way I said them, but this article made me more aware of where my actions stem from. We have more senses than just sight, and they all play a role in how we understand the world. One example that stood out to me, of how our bodies internalize and react to things, was our bodies’ behavior toward language and how we think metaphorically. When posed with certain questions or words, our bodies carry out a certain behavior or motion that we associate with the word, and this is all based on personal experiences or general language meaning. Given these experiences, we reason and make decisions based on the emotions we have felt during them, and express ourselves through such emotions. This makes me think of people who have experienced trauma. They have a different way of understanding the world and will react to things differently and feel different things based on language. I wonder if their ways of thinking metaphorically are altered too.

I believe computers can reach our bodies and our emotions if we expose them to different human experiences and program them to internalize those experiences in different ways, based on body language and oral language, using a metaphoric thought process. If we study humans enough and find the boiled-down reason for the way we react to things, then computers can adopt these habits and potentially grow with them.



It is obvious that our five senses help us to understand our world. Listening, touching, smelling, tasting, and watching play extremely important roles in our process of interacting with the world. People with impairments of these five senses have to make greater efforts to overcome difficulties in life; Helen Keller is one example.


I think it is hard to figure out exactly how we use our emotions to think, but emotions do affect our ways of thinking. People who are angry usually feel anxiety about their tasks. On the other hand, people who are joyful seem to have more patience while dealing with their work.


The easiest way to interact with a computer is to touch it, and actually, we have already done this a lot. So we are now thinking about ways that are even more convenient and efficient. There are scientific experiments verifying that when we are thinking, the electroencephalogram (EEG) fluctuates in a more dramatic way, which shows that our thinking can be measured. By relating this measurable data, I think computers can reach our brains. There are also a lot of appearance features that a computer can identify. Computers can recognize people’s facial expressions by identifying subtle details of people’s faces. For example, one’s eyes open wider when they are surprised, and people frown when they feel sad. By organizing these kinds of details, a computer can accurately reach people’s emotions.
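The facial-detail idea above can be sketched as a tiny rule, assuming a vision system has already measured a couple of facial features. The feature names, the 0.0–1.0 scaling, and the thresholds here are assumptions for illustration only, not a real recognition system:

```python
# Minimal sketch: map two measured facial features to a guessed emotion.
# Feature names, scaling, and thresholds are hypothetical illustrations.

def classify_expression(eye_openness, brow_height):
    """eye_openness and brow_height are normalized to the 0.0-1.0 range."""
    if eye_openness > 0.8:
        return "surprised"  # eyes open wider when we are surprised
    if brow_height < 0.3:
        return "sad"        # lowered, frowning brows
    return "neutral"

print(classify_expression(0.9, 0.5))  # surprised
print(classify_expression(0.5, 0.2))  # sad
```

Real facial-expression systems learn these mappings from thousands of labeled face images rather than two hand-written rules, but the principle of organizing small measured details into an emotion label is the same.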

One’s body is crucial to their understanding of both the environment around them and to the larger world. Its physical confines and senses serve as our window to the world around us, undoubtedly influencing our processing of said spaces. Embodiment provides a framework for this concept.

As our understanding of the world happens through physical space and physical interaction, a computer that wishes to attach itself to physical cognition must also operate within this space. Beyond simply allowing the user to type on a keyboard, computers that offer a greater variety of input types can better take advantage of embodiment. For example, Dan Oved’s project at Maker Faire New York, which used his PoseNet algorithm to move a physical sculpture [1], could instead use subtle changes in posture observed by researchers as an integrated emotional input mechanism. [2]
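A hedged sketch of what such a posture input might look like, assuming pose keypoints in image coordinates (the kind of output a model like PoseNet produces) and a made-up “slump” heuristic; the keypoint format and the heuristic are illustrative assumptions, not the actual PoseNet API:

```python
# Hypothetical sketch of posture as an emotional input signal.
# keypoints maps part names to (x, y) pixel positions; in image
# coordinates, y grows downward.

def slump_score(keypoints):
    """Return a 0.0-1.0 score; higher means a more slumped posture."""
    nose_y = keypoints["nose"][1]
    shoulder_y = (keypoints["left_shoulder"][1]
                  + keypoints["right_shoulder"][1]) / 2
    # A nose sitting close to the shoulder line suggests a drooped head.
    return max(0.0, 1.0 - (shoulder_y - nose_y) / shoulder_y)

upright = {"nose": (0, 100), "left_shoulder": (-50, 200),
           "right_shoulder": (50, 200)}
slumped = {"nose": (0, 180), "left_shoulder": (-50, 200),
           "right_shoulder": (50, 200)}
print(slump_score(slumped) > slump_score(upright))  # True
```

A design could then treat a rising slump score over a session as one emotional signal among many, rather than a definitive read on the user’s state.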




Our body is the tool we use to perceive the world. Whether one views the brain as embodied or disembodied determines whether one feels that “reason is disembodied because the mind is disembodied” or that “reason is transcendent and universal” (McNerney). From this, we can take that either our bodies determine the way we perceive the world, or our bodies are a tool our brains use to perceive the world. Regardless, our bodies play a major role in our perception of the world and what we take in from it.

Emotion is defined as a natural, instinctive state of mind deriving from one’s circumstances, mood, or relationships with others. Emotion goes together with intuition. We don’t decide to use our emotions when making a decision; it just happens. As in “Siri Is Not ‘Genderless’,” we see how just the listener’s perception that Siri is female changes the overall Siri experience. Marketing technology as “feminine” has been a popular strategy throughout history. We use our emotions to think by applying our instinctive perception of something to the decision at hand.

Computers greatly influence the way we think through their constant outpouring of information. Issues such as body image, cyber-bullying, and jealousy have all gotten worse for one reason: the computer allows us to compare ourselves to others. The constant bombardment of what’s trendy or who’s doing what has monopolized so much of our time without us even noticing or caring. Another way technology is influencing our bodies is in the field of medicine. As mentioned in “A Cyborg Manifesto,” topics regarding reproduction, genetic modification, and regeneration may pose an issue for some people. Medicine is on track for advancements in these fields, but computers, more specifically cyborgs, could do these things for us. Therefore, the question from last class shifts: is it immoral just for humans, or is it immoral in the general sense?

Bodies, Emotions, the World

Our bodies are the sensory receptors of our experiences. They also serve as the output source for our reactions to these experiences, whether these are emotional, physical, or, oftentimes, a combination of both. For example, annoyance is always accompanied by at least a side glance, at least for me. The body is greatly influenced by metaphorical thought as well, and this translates to our perception of the world. In a way, bodies are like computers, receiving inputs and executing commands; they’re just a lot frailer, more emotional, and more flawed. It’s cool though. That’s the human experience.

While it would be nice if our decisions were driven by rationality, they’re at least 90% of the time controlled by emotions. Like right now, I’m up at 2 AM doing homework, even though the rational thing would be to take care of myself and go to sleep, but I’m in a weird state of sadness that’s making me want to torture myself. So here we are. Rationality is cool, and listening to other people try and reason with you is cool, but y’know what’s even cooler? Just doing everything based on impulse and regretting it later.

Computers are bridges between bodies and emotions. Through them, we can connect with different people around the world. I found all my favorite artists through social media, so now I can keep up with their work and draw inspiration from various sources all in my Instagram feed. If we’re talking about how computers themselves can reach us emotionally, as opposed to the content or functions of computers, we’re getting into Siri territory. I refer to her specifically because of that article on how Siri represents feminized labor and technologies. This kind of information contextualizes and humanizes the computer somewhat, so that it might become interesting on a more personal level.

James’ Embodiment

Our body is probably the most essential tool that aids in understanding our world. As we learnt from our earlier readings, our hands aren’t just made for holding. They feel, weigh, and do so much more. Without the body, our understanding of our surroundings would be severely limited. Even though our brain takes all those interactions and processes them, it would be impossible to understand the world if it weren’t for the body. The body is like a computer. We use a computer’s hardware (body) to communicate with the computer’s software (brain); similarly, we use our bodies to understand our environment. However, computers don’t have emotions, or at least not emotions as we understand them. We humans, on the other hand, have emotions, and these emotions help us make decisions. For example, I wouldn’t lie to my mother because I am afraid of the deep remorse I’ll feel when she finds out I lied (which she always does). Our emotions are like consequences or rewards for the decisions we make. When something positive is done, we are happy and the brain releases dopamine. When we make bad decisions and deal with the consequences, we are rewarded with debilitating retrospection. The only way we can help computers understand our emotions is by using A.I. We can feed the A.I. human actions or decisions and their consequent emotions.

Understanding the World with Our Body

Our body is important to our understanding of the world through receiving input, processing it, and storing it for later use. Then our brain does its magic and categorizes all of our input. Our brain also lets us think critically about all of our daily observations, to make us feel things and to form our opinions. We receive input through the five senses: touch, smell, hearing, taste, and sight. Touching the teddy bear you have owned since you were six can make you feel comforted. Smelling the scent of lavender can make you feel relaxed. Listening to your favorite song that you have had on repeat can make you feel happy! Tasting freshly baked chocolate chip cookies can make you feel warm inside. Seeing your “A” grade on your paper can make you feel encouraged and excited.

Notice how I focused all of my examples on positive emotions?

What if I were to use negative emotions instead?

Touching chewed-up gum beneath your chair can make you feel disgusted. Smelling New York City’s pungent garbage while walking down the street can make you feel icky. Listening to the screeching sound of chalk against a board can make you cringe. Tasting liver and onions can make you feel like throwing up. Seeing your partner cheating on you can make you feel heartbroken.

When bad things happen in our lives, we can choose whether or not to let those things affect our happiness. More often than not, we tend to focus on the bad. Then it becomes difficult to work or to concentrate on tasks that we need to get done. People work better when they are in a good mood. There is way more motivation when there are positive vibes flowing through. Everything is a learning process, though. Once we know how we feel about certain things, we can choose what to avoid and what to follow through with.

Computers can reach our body and our emotions by contributing to us finding peace and happiness. Computers have the internet! The internet allows us to search up the song we constantly have on repeat or the freshly baked cookies we want to eat. When we are upset, we can communicate with our friends online and seek comfort from them or we can look up cute compilation videos of dogs.

Robert’s response to embodiment

The core of the embodiment idea is that “the mind is not only connected to the body but that the body influences the mind,” which “is one of the more counter-intuitive ideas in cognitive science” (Scientific American). This psychological research shows that we think not only with our brains but also with our bodies. Our body has many valuable sensors to help us understand the world. Moreover, our mind is designed to fit this body, such that the metaphorical ideas of warmth, up, and pressure are deeply programmed into our brains. Our brain thinks metaphorically without our noticing.


We use emotions to think all the time. Our thoughts can vary significantly with different emotional states. We think other people are nice when we are happy; we think other people are rude when we are down. Crazy people think the world is crazy. Emotion is like a pair of tinted glasses that we wear. When we are happy, we tend to pay more attention to happy things.


We typically think our mind controls how we behave. However, studies have shown that our body can also control our mind. For example, in one study, one group of males was asked to cross a wide, well-paved bridge, and another group was asked to cross a bridge that looked dangerous. At the end of each bridge, a female experimenter was waiting for them; she asked them a few questions and gave the participants her phone number for further questions. The result showed that the group who crossed the dangerous bridge called the woman significantly more often than the other group. The explanation is that the people who crossed the dangerous bridge had higher blood pressure and heart rates, which are markers of nervousness. The same kind of nervousness happens when we meet someone we love. So the body was sending the mind all of those signals, and the mind responded to them by thinking that we had found true love.


Our bodies and minds in relation to technology

A tool enhances our everyday experience of life. Glasses help us see better. A watch keeps us organized. A cane helps us walk. These pieces of technology help us every day. To go further, we can keep developing these technologies. Glasses can become smart glasses, providing us with directions and notifications. Watches have become smartwatches, providing step counters, notifications, and directions; they could become attachments of our bodies by adding hologram capabilities. Canes can include 911/emergency buttons. Our bodies can act as empty breadboards, and we can keep adding innovations to them to enhance our experience.

Our emotions drive most of our thoughts. From when our parents tell us to study instead of watching television to when we lie on our deathbeds, for every thought in between, no matter how rational we think we’re being, our emotions have impacted the decision we’re making. Computers can take emotion out of the equation. But when placed in a life-or-death situation, which rule fits best? The greater-good theory, where the fewest lives are taken for the greater good, or the one that protects you? Should it even protect you? What if there are only two people? Which life does it take? Yours or the other person’s? These questions are important, and they lead us to ask: do we really want emotions taken out of the equation?


It’s easy for us to overlook how much our body is used to understand the world. All five of our senses and all of our body parts contribute in some way to how we feel about or understand our world. With our sense of sight, the colors of Times Square at night or the blue sky on a nice day make us feel a certain way; they can make us happy, excited, gloomy, annoyed, and so forth. The smell of marijuana in Washington Square Park, or the lovely aroma when you walk into a Starbucks, can affect how we feel as well. The texture of your clothing on your body, or the feel of the keyboard on your computer, can make you feel comfortable. The music you listen to, or the loudness of a firetruck passing by, can affect our emotions too. And eating a good meal after a long day can make you feel fantastic.

I’ve listed many examples of how our senses can affect our emotions, but how do our emotions affect how we think? I think they can affect us in a large variety of ways. If we are in a good mood, we are motivated to work at our best capacity and might be able to look at things from a different view. If we are in a bad or frustrated mood, we could be too fixated on how distraught we are, and it could prevent us from thinking freely; or someone with a different point of view could take it as an opportunity to think of ways to make things better.

I think computers and technology can reach our senses and emotions in multiple ways. We can satisfy many of these good aspects on the internet alone. We can find new music that really appeals to us. We can see art pieces and photography that put us in a good mood or cause us to think. We can talk and communicate with others. I believe the internet is perfect for provoking us to think and motivating us to create.