{"id":129,"date":"2017-08-06T15:45:29","date_gmt":"2017-08-06T19:45:29","guid":{"rendered":"https:\/\/itp.nyu.edu\/~jvc301\/wordpress\/?p=129"},"modified":"2017-08-06T15:45:29","modified_gmt":"2017-08-06T19:45:29","slug":"when-siri-met-my-son-ux-lessons-from-my-baby","status":"publish","type":"post","link":"https:\/\/itp.nyu.edu\/adjacent\/issue-1\/when-siri-met-my-son-ux-lessons-from-my-baby\/","title":{"rendered":"When Siri Met My Son:<br>UX Lessons From My Baby"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">While on maternity leave, I expected to take a break from my job as a UX designer, but found that I couldn&#8217;t get away. Watching my young son learn basic human skills became a study in human interaction, one that led me to new insights into designing digital experiences for adults. Because babies are all about interfaces. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">A fascinating thing about newborns: They\u2019ve spent nine months gestating in an environment where everything was provided for them\u2014full nutrition, a steady temperature, their waste products removed. And yet from the moment they are born, they are transformed into what could be called a learning machine. Equipped with only the basic instincts they need for the task, babies spend much of their time and efforts learning to navigate what is to them a wholly new platform\u2014their mother&#8217;s body. They quickly discover it is a very good platform for comfort and warmth, and a source to cure their hunger. They immediately learn to latch onto their mother\u2019s breast and how to suck to get milk<\/span><span style=\"font-weight: 400;\">. Within days, they recognize the pattern of a human face. And from then on, a baby confronts new interfaces every day. 
They must discover everything without reading a manual.<\/span><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-878\" src=\"https:\/\/itp.nyu.edu\/adjacent\/issue-1\/wp-content\/uploads\/sites\/6\/2017\/08\/sketch4alejbig.jpg\" alt=\"\" width=\"663\" height=\"411\" \/><\/p>\n<p><span style=\"font-weight: 400;\">At first, when I watched my infant, it sparked rudimentary thoughts and realizations\u2014\u201cHuh, I guess all infants do this,\u201d or (with a slight panic), \u201cWow, he is completely vulnerable and depends on me totally for survival.\u201d Only some time later did I become more fully attuned to the process of learning unfolding before me. My thoughts turned into \u201cWow, humans are such fast learners,\u201d and, \u201cHe understands gravity? Nobody taught him that!\u201d My documentation of his milestones became a form of design research. Though it centered only on my own son, I slowly became aware of ways in which what I was seeing could inform a better understanding of the human instinct for learning. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">The ability of a user to learn and adapt to a new interface, of course, plays a huge role in interaction design. And so does the ability of the designer to predict the user\u2019s response to the new system. Thus, it is not only their state of being \u2018total learners\u2019 that makes babies so interesting to look at from an interface design perspective. Talking recently with the <\/span><a href=\"http:\/\/www.independent.co.uk\/news\/science\/revealed-the-serious-science-behind-a-babys-laugh-8225400.html#gallery\"><span style=\"font-weight: 400;\">Independent<\/span><\/a><span style=\"font-weight: 400;\">, Dr. Caspar Addyman, a researcher at the Babylab, a leading infant-research center at Birkbeck, University of London, said, &#8220;If you are trying to understand the psychology of humans, it makes sense to start with babies. Adults are far too complex. 
They either tell you what you want to hear or try to second-guess you.&#8221; But if a baby does something, he concludes, &#8220;it&#8217;s bound to be a genuine response.&#8221; <\/span><\/p>\n<p><span style=\"font-weight: 400;\">Much of our appreciation for good interface design relates to how natural it is to learn. Doorknobs and light switches are good examples of intuitive design, as are touch screens, whose interfaces are so intuitive that even babies as young as six months old can learn to navigate them and retain the gestures in muscle memory.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A considered, intuitive approach to interface design is critical for the successful development of any innovation that requires more complex learning and adaptation. The development of early computer interfaces in the &#8217;60s-&#8217;80s is a good example. At first, only proficient programmers could use them. But the invention of GUIs (graphical user interfaces) made it possible for a wide variety of people to use computers, and thus for computers to become consumer products. The breakthrough interface work of Alan Kay at Xerox PARC played a pivotal role in the emergence of computers as a common tool for expression, learning, productivity, and communication\u2014a role no less important than such factors as hardware and chip speed. The design of GUIs was based on Constructivism, a learning theory that emphasizes learning through doing, or \u2018constructing understanding\u2019 by gaining knowledge through iterative interactions with one\u2019s environment. Alan Kay drew on the theories of Jean Piaget, Seymour Papert, and Jerome Bruner, who had studied the intuitive capacities for learning present in the child\u2019s mind, and the role that images and symbols play in the building of complex concepts. 
Kay came to understand, as he put it, that \u201cdoing with images makes symbols.\u201d This was the premise behind the GUI, which enabled computer users to formulate ideas in real time by manipulating icons on the computer screen. Kay\u2019s approach made computers accessible to non-specialists. More importantly, it transformed the computer into a vehicle for popular creative expression.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Such a case study exemplifies a general principle that, I would argue, is useful for designers who see value in a constructivist approach: since babies\u2014who start with no knowledge\u2014learn through action, paying close attention to the ways in which they discover and adapt to the world can provide UX designers with valuable insights into interface design approaches\u2014even when designing for adults.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This is especially true when designing in an era such as ours, marked by a broad range of new, rapidly evolving possibilities for user interaction.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">I will outline two ways in which my newfound appreciation for this principle shaped my thinking on interface design.<\/span><\/p>\n<h1>VR: Create an Environment of <i>yes!<\/i><\/h1>\n<p><span style=\"font-weight: 400;\">When I asked my pediatrician why my one-year-old trips so frequently, his response was that while in my child\u2019s mind he is great at walking and capable of doing so quite fast, his body cannot quite keep up. Thrown into a new environment with tools he is only beginning to master, my child stumbles often. 
And while a series of falls will of course not make children give up on learning to walk, adult learners are rarely so forgiving; for them, such early stumbles might prove highly discouraging.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Virtual reality, with its ability to let us vividly imagine realities unconstrained by everyday physical limits, is a good example of such a problem. When a user first dons a VR headset, their perception of physical laws is often upended. The immersive nature of the medium means one\u2019s sense of orientation and balance may be easily thrown off. The strength of VR to transport its users into what feels like a different space governed by different rules is also the source of one of its central weaknesses: namely, the nausea this transportation so often induces. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">While VR content creators might feel as if they are curating experiences without boundaries, it is important for us to carefully consider the effects and outcomes generated by rapid immersion into such a new environment. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">It is a well-established fact that infants and toddlers absorb information faster when they are in a familiar, nurturing environment. When moving babies to a new environment, it is important to keep some familiar elements with them so they feel emotionally comfortable. Research suggests that infants repeat certain behaviors until they are confident they have mastered them<\/span><span style=\"font-weight: 400;\">. 
It may follow, then, that when setting up a virtual world that abides by different laws of motion, one would be wise to introduce the novel spatial elements in a slow, controlled manner, allowing for the supporting presence of features familiar from \u2018the outside.\u2019 Another important point in this respect is that babies <\/span><span style=\"font-weight: 400;\">are <\/span><i><span style=\"font-weight: 400;\">active learners<\/span><\/i><span style=\"font-weight: 400;\">: they do not sit passively and take in information. They probe and test their way into knowledge.<\/span><span style=\"font-weight: 400;\"> And while their motivation and thirst for growth may be distinctly different from adults\u2019, the essential learning principle holds: in order to be fully engaged in the learning of a new world, <\/span><i><span style=\"font-weight: 400;\">the viewer must be the protagonist, not just a spectator.<\/span><\/i><\/p>\n<p><span style=\"font-weight: 400;\">Likewise, design that allows for repetitive, user-centered behavior will shorten users&#8217; learning curve in VR. Dynamic, responsive interaction that takes into account the user\u2019s outside reality and sense of agency is key to helping users gain familiarity with new experiences.<\/span><\/p>\n<h1><span style=\"font-weight: 400;\">Voice Recognition AI: Single Mind vs. Open, Global-Minded Conversation<\/span><\/h1>\n<p><span style=\"font-weight: 400;\">The acquisition of motor skills is one sphere of learning young children are immersed in. Language is another. My husband and I come from different linguistic backgrounds, and found ourselves deeply invested in understanding how best to raise our baby in a trilingual household. Could we foster the acquisition of three languages (English, Korean, and Spanish) without causing confusion or developmental delay? 
<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Upon delving into the literature surrounding this question, I was amazed to discover that infants are able to differentiate one language from another\u2014even before they understand any of the given languages. Besides encouraging our multilingual parental undertaking, learning about this superpower afforded me a new perspective on strategies for future machine learning. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">When a team I was part of at R\/GA was working on an AI project, we found that for many users, one critical point of frustration was when speech-recognition algorithms failed to understand a user\u2019s spoken words due to their accent. \u201cIt never really works for me,\u201d one such user, who spoke perfect English with a slight Japanese accent, told us during an interview. \u201cI tried to train Siri how to pronounce my name, but it doesn&#8217;t get it. It only works with common English names.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The reasons for this problem, which is of course well known to a large portion of consumer-electronics users, are not innate to computing, but rather the product of cultural and economic decisions informing programming approaches. One central issue (well detailed in <\/span><a href=\"https:\/\/www.wired.com\/2017\/03\/voice-is-the-next-big-platform-unless-you-have-an-accent\/#.13gysbqma\"><span style=\"font-weight: 400;\">this<\/span><\/a><span style=\"font-weight: 400;\"> informative article) is the expense of collecting data, a factor that contributes to certain key demographics taking priority, and to an AI voice devoid of any identity or accent beyond the \u201cmainstream.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Could examining infants\u2019 linguistic acquisition help us tackle this problem? 
In her article <\/span><i><span style=\"font-weight: 400;\">Early Language Acquisition<\/span><\/i><span style=\"font-weight: 400;\">, Patricia Kuhl describes the innate linguistic flexibility exhibited by small children: \u201cInfants can discriminate among virtually all the phonetic units used in languages, whereas adults cannot. Infants can discriminate subtle acoustic differences from birth, and this ability is essential for the acquisition of language. Infants are prepared to discern differences between phonetic contrasts in any natural languages.\u201d This early neural plasticity is pivotal to an infant\u2019s future learning, laying out the basic elements of language command. <\/span><\/p>\n<p><span style=\"font-weight: 400;\"><i>The universal language timeline of speech-perception and speech-production development. This figure shows the changes that occur in speech perception and production in typically developing human infants during their first year of life. <\/i>(Kuhl, <i>Early Language Acquisition<\/i>, p. 832)<\/span><\/p>\n<p><span style=\"font-weight: 400;\">When people talk to infants, they often adopt a certain \u2018baby-talking\u2019 tone of voice. This tone is remarkably similar all over the world. Evidence suggests that this style of speech facilitates infants\u2019 ability to learn the basic codes of speech. We know how to help our babies learn. With this type of \u201cdeep teaching\u201d in mind, I wonder\u2014what would our machine-learning programs look like if we approached the task of programming them in a way that more closely resembled how we help our babies learn? If we started by teaching computers the most basic communication skills, ones that non-human mammals use, and then proceeded to teach them skills that are unique to humans, such as statistical learning? 
AIs might have very different capabilities and potential for growth.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">During the development of speech-recognition products, designers are rarely in a position to affect the programming of the voice being generated, but they often do shape the user experience as it relates to the conversation. Designing accent-flexible AI algorithms might start with modeling them to learn in ways more similar to the way human children do. The silence after the first greeting phrase uttered by a Siri-like device should be receptive to the different styles of speech used by different people. Designing for phonetic flexibility\u2014rather than for a predetermined \u201ccommon\u201d speech pattern\u2014would allow the relationship between a user and a machine to deepen in an organic way.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Designers would be wise to learn from the flexibility built into babies, and to build into machine-learning programs some affordance for adapting to different accents and intonations. We need to start by giving computers the ability that infants have, not the narrower ability that adult scholars and engineers have.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Theory of mind tells us that when speaking with another person, one innately expects one\u2019s partner to take some notice of one\u2019s background, and make allowances for it. Most individuals will allow, in some way, the words and cadence they employ to shift in response to their partner in discourse. 
Machines are not there yet, but that is what we must aim for if talking with a machine is ever to feel natural.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The flexibility needed for good conversation is not only about phonetics, of course.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Most people would not have much tolerance for a conversation partner who regularly ended a conversation by saying \u201cI don\u2019t know\u201d and then immediately walking away. Yet our \u2018smart\u2019 machines cut off conversations all the time. Likewise, anyone conversing with an AI will quickly realize the importance of designing for an engaging conversation loop, rather than programming the machine to simply say \u201cSorry.\u201d Incorporating emotional intelligence, or an\u00a0<\/span><span style=\"font-weight: 400;\">emotional quotient (EQ), in AI is the next challenge in our \u201cIntelligent Age.\u201d<\/span><\/p>\n<p><span style=\"font-weight: 400;\">My son was eleven months old when we brought an Amazon Echo (AKA Alexa) home. I worried that due to its disembodied voice, the machine might confuse my child. I was relieved to find that this was not the case. I believe Alexa\u2019s gentle light and sound helped my son smile and accept its responses. Such cues can help create welcoming human\/machine interaction moments. But even with this in mind, much more work needs to be done in the field of AI, especially in the areas of human cues and natural language.<\/span><\/p>\n<h1><span style=\"font-weight: 400;\">Learning From Learning<\/span><\/h1>\n<p><span style=\"font-weight: 400;\">I recently wrote down in my diary that my son gave a big kiss to \u201cmong mong.\u201d Mong mong is the Korean onomatopoeia for a dog\u2019s bark. 
After fifteen months of living on this earth, equipped with our human pattern-detection and computational abilities, he has identified a dog, is able to associate a Korean sound with it, and has built a relationship with an object\u2014a doll\u2014that symbolizes a dog. I propose that UX developers would benefit greatly from taking time away from the digital, tapping into our innate power of observing the people around us and how they learn and construct the world. <\/span><\/p>\n<p><span style=\"font-weight: 400;\">I learn from my son\u2019s learning. I am constantly surprised by how much I learn about UX design by following my son\u2019s growth. And at the same time, my belief in the unique power of designers is strengthened as I realize how my training as an interface designer helps me closely examine and make sense of the ways in which my son learns to navigate the world around him.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>She meant to take a break from her job as a designer, but maternity leave turned into a hands-on UX design 
lab.<\/p>\n","protected":false},"author":1,"featured_media":878,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-129","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-issue-1"],"_links":{"self":[{"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/posts\/129"}],"collection":[{"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/comments?post=129"}],"version-history":[{"count":0,"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/posts\/129\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/"}],"wp:attachment":[{"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/media?parent=129"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/categories?post=129"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/itp.nyu.edu\/adjacent\/wp-json\/wp\/v2\/tags?post=129"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}