
Code: The Hidden Language Behind Everything We’re Doing at ITP.

I’ve seen that two other people have already written about this book (one being my Applications partner, which makes this post clearly very late, but I wanted to finish the book first). Though they’ve written wonderful things about the subject, I wanted to add to what they’ve said a concept that I took from the book, which, I feel, is the magic behind computers. That concept is the bit (which, whether in life, in ICM, in PComp, or in this book, we’ve understood as a binary digit). What is a binary digit? What is a digit? Why do we choose ten digits rather than 8, rather than 32 or 95?

In the beginning of the book, he talks about communication: how, in certain kinds of communication, the best understanding we can share is two values, on or off. Morse code and Braille both use this concept over and over. Putting an on and an off in sequence with other ons and offs creates a code that can carry information in an understandable way. But is on and off the only way that we can learn? No. What about changing the length of on or off? That will give us different values, like Hannah talks about with Morse code and logic. How can you manipulate on and off to give you different things, as Henry says about electronics? I don’t think you can truly know without, as Henry says, “knowing what you’re working with.” That is why understanding the bit, and what the computer does with it, is the most important part of not only being able to artistically manipulate a computer but also being able to communicate with computers. The bit is the lowest value of information, on or off. This is the infrastructure of computers.

Charles Petzold begins the discussion of bits by talking about base ten, our number system. He says that we clearly count in base ten because of how many fingers we have. He also says that one of the most significant mathematical discoveries came with the discovery of zero, which made it possible to think about the nothing integer, or the reset digit. The interesting part comes when we are counting up and we run out of fingers, at 9. What did we figure out? To reuse the digits and make 10, which is just starting the count over while holding a remainder to remember that we’ve already been through the digits once. This is super important because, as we’ll see, this is the way to count in other, more complicated number systems.

Next he asks: what if we weren’t humans but cartoon characters? We’d only have 8 fingers and toes. Then we wouldn’t have a use for the digits 8 or 9. We’d count to 7, and then 8 would turn into 10. Think about it: 8 would be the reset. 1, 2, 3, 4, 5, 6, 7, 10, 11, 12, 13, 14, 15, 16, 17, 20, etc. It doesn’t stop there. What if we were all crabs? Then our hands would be claws and we’d count in base 4: 1, 2, 3, 10, 11, 12, 13, 20 and so on. And it would be the same if we were dolphins. Dolphins have two flippers and therefore would count in base two, binary, ones and zeros: 1, 10, 11, 100, 101, 110, 111, 1000, 1001, 1010, 1011, 1100, 1101, 1110, 1111. Those are the first fifteen counting numbers in binary. It’s very simple. You count 1. Then instead of 2 it would be 10, because you reset the count, since there’s only a 1 and a 0 in your world. Don’t understand? Think back to when we were counting in base 8. We had no use for the digit 8, just like when we count in base 10 we have no digit higher than 9. We get to the last digit and start over, storing a remainder. That’s all base ten is. It just constantly stores remainders as you count up. Look at the sequence of binary numbers counting up to 15 in base ten. Notice how every time it reaches what would be two in our base ten, it moves over a place, storing the reset: 1, 10 (stored), 11, 100 (stored), 101, 110 (stored), 111, 1000 (stored). This is how a computer works. It’s constantly storing all of the counts and manipulating them to give different values. That’s also how electricity works. It’s on or it’s off, 1 or 0. And chips that have switches (on or off) store this like memory. The way chips are made to store electrical impulses is a little more complicated and quite beyond the topic, but I really wanted to uncover the magic of counting in base two, as Charles Petzold did for me. I hope you enjoyed it as much as I did. Let me know if I didn’t make sense.
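The carry-a-remainder rule above works in any base, so a few lines of code can reproduce the animal counting sequences. This is a minimal sketch of my own (it is not from the book, and `to_base` is a made-up helper name):

```python
# The "run out of digits, carry a remainder" rule, for any base up to 10.
def to_base(n, base):
    """Convert a non-negative integer to its digit string in `base`."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % base))  # remainder = current (rightmost) digit
        n //= base                    # carry the rest to the next column
    return "".join(reversed(digits))

# Cartoon characters (base 8), crabs (base 4), dolphins (base 2):
print([to_base(n, 8) for n in range(1, 9)])   # 1..7, then 8 becomes "10"
print([to_base(n, 4) for n in range(1, 5)])   # 1, 2, 3, then "10"
print([to_base(n, 2) for n in range(1, 16)])  # 1, 10, 11, 100, ..., 1111
```

Each division by the base is one "start over and store a remainder" step, which is exactly the finger-counting reset described above.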

10 comments to Code: The Hidden Language Behind Everything We’re Doing at ITP.

  • William Lindmeier

    That’s a clever way of describing the binary number system. Even though I’ve had an abstract understanding of how computers work, ITP has really opened up my eyes to the fundamentals of pushing electricity around to make data. And now for my 1,000th word: 01010000011011110110111101110000

  • William Lindmeier


  • gs1754

    This is a great explanation! I agree that we must know bits and how to manipulate them, but sometimes when you’re so stuck on something, you forget that there is another way. Correct, all our technology is based on bits, on 0’s and 1’s, on ON and OFF, HIGH and LOW. From the early days of computation we were thinking in that direction and developing our technology towards the binary system. But is it the only way to do it? Can we find another way? What if instead of electricity we were using some other energy? Some other material to transfer the data? What if we made our technical equipment and devices in a way that they communicate in base 3? Three states instead of two, or maybe 1000? Or maybe we would come up with a system that communicates in structures? Maybe we could by now settle on another, way-better planet and drink Quahily, which is the coolest drink on that planet. Think about it…

    In the rest of this reply I would like to try to explain the hexadecimal base system, as a continuation of the binary explanation, for those of you who struggle with it.

    So, by the time scientists came up with this genius binary system, they soon realized that it is very difficult for human beings to read those numbers. The problem is that they are so long that you just can’t remember them. How would you remember the number 1100100101110001? How about 1011000? Try it. To remember them you need to bring them into their decimal (base-10) representations (1011000 = 88), which takes time to do, and is annoying.
    They needed a way to transfer binary numbers into a memorable representation without using calculators (because they were just inventing those), and without spending too much paper writing everything down in binary. So they came up with another wonderful way of counting: the hexadecimal system. Have you seen 0x1E? 0xB7? Are you scared? Don’t be, it’s easy.
    Those scientists chose base 16 (16 different digits going from 0 to 15) because it has a nice relationship with base 2: since 2^4 = 16, four digits of a binary number always fit into one digit of a hexadecimal number – numbers four times shorter to remember, and also a quick way to translate from binary.
    They needed a way to count up to 15 with one digit, and they realized that they didn’t have enough digits, so they took the quick route and just used the English alphabet to fill the gap. So they counted 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F. Brilliant!

    So I hope you understand how this makes the numbers easy to remember (four times shorter), but how will it make them easy to translate from binary? Well, every four binary digits translate into one hexadecimal digit. Four binary digits can make 16 different representations, going from 0 to F (15) – yeah, just like a hexadecimal digit: 0000 = 0 up to 1111 = F. This way you only need to count in binary, as Charles Petzold suggests, up to 15 (F), and you eventually memorize some of them.

    So for example, if we have the number 01011010, we break it into groups of four:
    0101 1010
    We then translate each group individually:
    0101 = 5, 1010 = A
    We put the digits together and we have our number: 0x5A
    So what is that “0x” thing at the start? It’s just a warning: “Beware! a hexadecimal number is coming” – a way to tell the programming language that you mean hexadecimal, and also to prevent crappy programmers from touching your code.

    If we go back to that huge number at the beginning we will see that it is fairly easy to remember in Hexadecimal.


    1100 1001 0111 0001
    C    9    7    1

    and our enormous number is down to 0xC971.
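    The grouping trick above is mechanical enough to sketch in a few lines of code. This is my own illustration, not part of the original comment, and `bin_to_hex` is a made-up helper name:

```python
# Translate a binary string to hexadecimal by mapping each 4-bit group
# to one hex digit, exactly as described above.
HEX_DIGITS = "0123456789ABCDEF"

def bin_to_hex(bits):
    bits = bits.zfill((len(bits) + 3) // 4 * 4)  # left-pad to a multiple of 4
    groups = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    return "0x" + "".join(HEX_DIGITS[int(g, 2)] for g in groups)

print(bin_to_hex("1100100101110001"))  # 0xC971
print(bin_to_hex("01011010"))          # 0x5A
```

    Because each group is translated independently, the whole conversion never needs any arithmetic bigger than counting to 15.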

    Hope you understand, though reading it all at once now… I’m just not sure this explanation will work. But if you got this far, thank you.

  • Nancy

    Thank you both for all of these explanations…
    and Bill, congratulations on 1,000 words… worth a picture⭐

  • wl379

    Wow, thanks so much for the straightforward explanations. Hexadecimal always seemed so daunting; it’s nice to hear it broken down like that. It almost makes sense now. It reminds me of the time I learned how an abacus worked. There was a great “aha!” moment when I was actually able to see the calculations happen as an intuitive process and not behind the scenes (which is how it felt trying to watch someone use one – those guys are fast).
    Ryan, excellent job explaining Petzold’s animals analogy as a means of addressing binary. That humans count in Base 10 makes sense when you look at your hands. The point made me stop and consider how many mathematical concepts have been lost on me over the years because I haven’t been able to identify with them in this sort of non-abstract way. Seeing and playing with an abacus, for example, allowed me to understand how it worked better than any explanation had in the past. The physicality of manipulating objects is much like wiggling the fingers on your hand. You get the concept because it’s something you identify with, something you can experience as a human. The animal analogy works so well for me along those same lines, because it still engages some sort of empathetic response. I’m a crab, cool! I would be looking at my wriggling crab claws with the same “this makes sense!” feeling. I am not sure what it says about me that going from crab to dolphin made it easier for me to understand binary as a counting system (I really like the sea?), but it was successful. Thanks!

  • rtb288

    That was a wonderful explanation of hexadecimal. What’s really amazing about these different ways of counting is when you realize that everything your computer is doing is complex math on huge quantities of numbers, and even then it’s still just adding stuff together and giving results. The image on your screen is just numbers. The reason I bring this up is that I had a breakthrough about a month ago with a computer science problem.

    Often when I was working on a project, I wouldn’t do something because I didn’t want my program to break. We probably all go through this feeling at some point. On my current project, I knew that I needed to get a PGraphics from my main class into a subclass. I couldn’t think of how to do it until I realized that I wasn’t sending a graphic; I was sending a number. The PGraphics isn’t a physical graphic, just like a PImage isn’t a physical picture. It’s a bunch of numbers strung together which are sent to the screen and decoded into color and size values that are then visible. I realized sending it to the first subclass was just like sending a number. Then sending it to another class was easy: just send the number, add other numbers to it, and send it back to the main class, or anywhere for that matter. That level of abstraction was completely beyond my scope until I had to cross it to get what I knew I wanted and knew was possible. So next time you feel worried about trying something, just remember: these things are only numbers, and you can manipulate them all like you manipulate an int. Once you understand that, you’ll be great.
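    The same insight can be sketched outside Processing. This is a hedged illustration in plain Python (the class and variable names are made up, and this is not the actual PGraphics API): an “image” is just a collection of numbers, so passing it between classes is just passing a reference to those numbers.

```python
# An "image" as a flat list of pixel ints, handed between classes like any value.
class Canvas:
    def __init__(self, pixels):
        self.pixels = pixels  # just numbers, like PImage.pixels in Processing

class Effect:
    def brighten(self, canvas, amount):
        # manipulating the "graphic" is nothing but arithmetic on ints
        canvas.pixels = [p + amount for p in canvas.pixels]

canvas = Canvas([10, 20, 30])
Effect().brighten(canvas, 5)
print(canvas.pixels)  # [15, 25, 35]
```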

  • Todd Bryant

    Of course you would end up with 1000 words exactly.

  • Surya

    I think that’s a really good way of looking at things, because they really are just numbers. What’s interesting about the way you fixed that problem is that you broke it down to a basic unit that is a lot less intimidating than something called PGraphics. I remember I had a similar revelation when I first started writing code. It was during my thesis: I was trying to make a heart monitor for premature babies that would pick up their heart rate easily without worrying too much about noise (which is a pretty well documented problem when working on medical instrumentation for neonates). When I first started this project I was extremely worried; although I understood the theory behind what I was trying to do, I had no idea how I was going to code it. Over the course of this project I learned a lesson which has always stuck with me: the trick to solving any problem is asking the right question.

    When I first started working on the program I had to initialise serial communications between my PIC microcontroller and the PC’s HyperTerminal, and I assure you it took a lot more than a Serial.begin(9600). Initially this task seemed very daunting, because I think I was overwhelmed by the complexity of all the things I was trying to do. However, over time I realised that even though I might be trying to make something relatively complex, I could normally isolate a problem to a basic component of the system. With practice, I started seeking this simple component whenever I would get stuck. An unexpected result of doing so was that the question I had would become very specific, and as a result very easy to google.

    The more time I spend in this field, the more I’ve realised that it is really important to break complex problems down to a level where you can grasp them. This, I feel, is a fundamental building block for working with computers, and once you master how to break a problem down, it not only helps you debug code but also makes it easier to write code. And most importantly, it makes computers and coding seem less scary. It’s possible that this might just be my journey and other people have had different routes, but I still feel there is some truth here. And in the world of Stack Overflow and Quora, nothing is given more attention than a well-formed question.

  • rtb288

    I totally agree. The best thing that I learned at ITP this semester was pseudo-code. It’s such an easy thing to do and far too often overlooked as something to do while coding rather than before. However, it’s such an important part of creating a program that will do something, solve some problem, or track how many times the word “hello” is said on Twitter. It all starts with, “I want to track how many times the word ‘hello’ is said on Twitter.” That is your program, but so many people would just say, “OK, let’s jump into the Twitter API and start coding,” which isn’t always a bad thing. For instance, if Dan Shiffman wanted to figure this out, he’d clearly start coding and use his already established OAuth key to get that data. But even he would have to step back and ask what he wants to do with the data. Maybe he’d like to increment a counter variable with the number, or maybe he’d like to figure out how many times it’s said on the second Tuesday of every February. As you start coding, more problems pop up, like how to display data, or how to save, or how to this or that. That’s why it’s so important for people to stop with their idea, take a step back, and ask themselves what they want from the data. Even taking five minutes to write out a paragraph of what their code will do will make their lives so much better. For instance…

    …I want to make a Twitter program that counts how many times the word hello was written on the second Tuesday of every April. (Right here I have so much information about how to write this program. I know that I’m going to use the Twitter API, which means that I have to get it and sign up for it. Then I’ll probably get some examples with it. I also know that I’m going to search Twitter for the word “hello”. I know that I need to find a date function in the API, something that will let me look up specific dates. Then I can find, in Java’s vast class libraries, by googling “how do I pick specific dates” or something like it, a date function to produce the exact dates that I need (specifically, what dates the second Tuesday falls on in April since the beginning of Twitter). Then I can start to increment the overall value once those things are in place. That is the first sentence, and it exploded into a lot of logical steps towards the goal. But still, what is the goal? Well, we should define the goal.) I want to make a door open every time that the word “hello” is said on Twitter. (This opens up a whole other set of logistical problems. Do I want the door to be virtual or real? Does the door open every time, all at once? Does it stay open for the number of seconds that hello adds up to? Do we show the post that had the word in it? If the door is virtual, does the text of the tweet fly through it, which would end up determining how long the door is open – the amount of time it takes to fly through the door reasonably? All of these questions and more define what your program is.)
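    To show how the pseudo-code turns into a program skeleton, here is a rough sketch. This is not real Twitter API code: `fetch_tweets` is a hypothetical stand-in returning canned data, and the names are made up.

```python
# Skeleton derived from the pseudo-code: fetch tweets, then count a word.
def fetch_tweets(query, date):
    """Hypothetical stand-in for a real API call; returns canned tweets."""
    return ["hello world", "well hello there", "goodbye"]

def count_word(tweets, word):
    """Count whole-word occurrences of `word` across a list of tweet texts."""
    return sum(t.lower().split().count(word.lower()) for t in tweets)

tweets = fetch_tweets("hello", "second Tuesday of April")
print(count_word(tweets, "hello"))  # prints 2
```

    Notice that the one-sentence goal already dictated the two functions: one to get the data, one to answer the question about it.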

    As you can see, two sentences posed many questions and many ideas or choices. That is what programming is to me: a set of distinct choices, or, dare I say it, bias (more on this in a second). Not every program is going to use the Twitter API, only the ones that need the Twitter API. That seems like a stupid statement, but there are so many ways to get the information that you need. The people who wrote the Twitter API didn’t have the Twitter API to reference; they wrote it, which means the techniques and information are out there. You just have to figure out what you need and how to get it, and that info is out there and everywhere.

    Before I leave, I’d like to speak about bias, as it was probably the second most interesting thing I learned this year. Many of you know Kyle McDonald. He did his thesis on the MP3 architecture and the history of its development. It doesn’t sound that interesting, but what he told us about the history was fascinating. MP3 was created in the late ’80s by a young man doing his dissertation for school. The reason he built it was to figure out a way that scientists could send info over ISDN lines faster. It works on a compression algorithm that chops out the “unneeded” parts of sound – the really low parts or the high parts, etc. The first thing he did was make a choice. He decided that humans can’t really hear above 16 kHz, so he deleted it. This choice, in my opinion, had the largest effect on music ever. Think about it. We have all benefited largely from this change, but now humans don’t hear those sounds any longer. Most of us can’t hear above that range anyway, but you see what I’m getting at. A man in a lab, trying to develop something that his peers would enjoy and laud him for, destroyed an entire range of sonic representation for, as far as I can see, humankind. He clearly had no idea that his scientific breakthrough would have such far-reaching effects both inside the music industry and outside, but it did.

    I’ll leave you with two other quick gems from Kyle. 1) In the original MP3 specification, there was a byte set aside for “copyright”. hahaha 2) In the original ID3 tags for iTunes, there’s a genre called Primus. Just a couple of nerds sitting in a room, coding little jokes to themselves, having no idea of the implications. Check out the rest for a finals-time laugh…

  • Gal

    It’s wonderful to see your enthusiasm towards programming and computers in general. But beware of becoming a machine-like human being! I’ve been to Goa, India, taking a yoga class there on the beach, and the instructor said to me, “Try doing this less like a machine…” Thinking too much like a computer processor tends to make you one. Believe me, I’ve been doing this for more than 20 years. I got sucked into programming when I was 9 years old, reading a geek’s magazine. I love the creative process in programming. I like to invent a problem and then solve it.
    I totally agree with what Surya was saying: “The more time that I spend in this field the more I’ve realised that it is really important to break complex problems down to a level where you can grasp them…”. More than that, I find it easiest when you make your program compile and work along the way, solving small problems and seeing that they work together. Sometimes it’s not possible, but when it is, it’s very helpful.
    For example, in the Twitter example, I would write a function that accepts the word and returns how many times it appeared; then I can see that portion working by just writing the results to the console. After that, I can work on the visualization, and at the end just connect the two.
    The impact that programmers have on this world is overwhelming. It is interesting to think about Adobe Illustrator as an example. Most of the designs we see today look vectory: a lot of clean lines, simple shapes. Does the software that we develop affect our stylistic taste, or is it the other way around?