
The Sum of Parts

Published Nov 24, 2018

I wrote this article to explain to my non-techie friends what I do in my day to day job. I’m a senior programmer. My daily grind involves thinking about computers, programming computers, testing computers and swearing at computers. I wanted to explain what a program is and how it works, but I ended up going to a very low level, explaining how you get from a bunch of wires and electronic signals to a fully functioning computer.

A computer is a black box which can be as small as a watch or as big as a filing cabinet. Inside it lives a gremlin that never sleeps, eats, or drinks but is capable of doing and knowing far more than you will ever be capable of. Just kidding - there is a gremlin but it doesn’t really “know” or “do” anything on its own: it’s just following a set pattern of predetermined instructions, the result of which is certainly greater than the sum of its parts.

Think about language and the way humans have decided and agreed upon an intricate system of complex meanings based on tiny pieces. We have agreed collectively that letters have a written representation and a sound when spoken. They’re pretty crap by themselves and don’t really get up to much. However you can combine letters into words which are actually quite useful. They let you use a simple sound to represent something else instead of having to point at it or mime it. But words have their limitations too because their meaning is easily confused without the appropriate context. What does ‘meat’ mean?

  • I want some meat?
  • Where is the meat?
  • I don’t like you and I will turn you into meat?

To get around this we agree upon grammatical structure and then arrange words to form sentences, allowing us to specify the context of our word usage, greatly reducing ambiguity and stopping us from eating each other. Letters, words, sentences, language. A fantastically versatile and intricately beautiful system can be built in gradual increments from tiny building blocks which themselves are nearly useless. Computers are just another example of a system of emergent behaviour like this.

At their most atomic level computers are made up of transistors which form logic gates. A transistor is a commonplace circuit component (like a switch, bulb, resistor or diode) which can amplify an electronic input to produce an electronic output - simply put, it is an electronic switch. Multiple transistors can be combined together in a circuit to form a logic gate. A logic gate converts the combination of multiple input signals into a single output signal; each signal is said to be ‘on’ (1) or ‘off’ (0) in a binary fashion. An OR gate will output 1 if either (or both) of its inputs are 1, whereas an AND gate will output 1 only if both of its inputs are 1. There are many different kinds of logic gates, and they become many times more useful when combined together in interesting and complicated ways. Chips (those black things with metal legs) contain pre-built circuits with logic gates arranged in a specific way to meet a dedicated task such as timing, motor control or voltage regulation.
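As a rough sketch (in Python rather than in wires), you can picture a logic gate as a little function that turns 0/1 inputs into a 0/1 output:

```python
# A sketch of logic gates as simple functions on 0/1 signals.
def AND(a, b):
    # 'on' only when both inputs are 'on'
    return 1 if a == 1 and b == 1 else 0

def OR(a, b):
    # 'on' when either (or both) inputs are 'on'
    return 1 if a == 1 or b == 1 else 0

# Print the truth table: every combination of two inputs.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "-> AND:", AND(a, b), "OR:", OR(a, b))
```

The real thing is done with transistors and voltages, of course - the functions above just mimic the behaviour.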

Electronic signals in this low-level circuitry context can only be “on” or “off” - 1 or 0. This isn’t very helpful for us as humans. To be useful, a computer needs to represent more than 2 states. Human beings have managed this apparent limitation by inventing (or, more accurately, discovering) binary numbers as a way of representing any number using only the digits 1 and 0.

Humans commonly use the decimal system, which has digits from 0 to 9. If you want to represent a number greater than 9 you need to add another ‘place’, and this is done by putting digits side by side. A 7 next to a 3 next to a 4 means 734, where the 7 represents the hundreds, the 3 represents the tens and the 4 represents the units - 7 100s, 3 10s and 4 1s. Did you notice that in the decimal system each ‘place’ goes up in powers of 10: 1s to 10s to 100s? The number of digits and the size of the ‘place’ increments is known as the ‘base’ of a number system - decimal is a base 10 number system and binary is a base 2 number system.

This process of ‘places’ having a predetermined magnitude works with binary digits too. 101 in binary is the equivalent of 5 in decimal. Binary numbers work the same way as decimal except each ‘place’ increments in powers of 2: 4s then 2s then 1s. Reading 101 from left to right we have 1x 4, 0x 2 and 1x 1. Why powers of 2? It is because you only have a 0 or a 1 in each ‘place’ (2 possible values), whereas in decimal you have 0 through to 9 in each ‘place’ (10 possible values). As an aside, decimal 734 is 1011011110 in binary - 1x 512, 0x 256, 1x 128, 1x 64, 0x 32, 1x 16, 1x 8, 1x 4, 1x 2, 0x 1.
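To see the ‘places’ at work, here is a small Python sketch that reads a list of binary digits the same way we just read 101 and 1011011110 (the function name is mine, purely for illustration):

```python
# Read a binary number the way we read decimal: each 'place'
# is worth a power of the base (here, base 2).
def from_binary(digits):
    total = 0
    for d in digits:           # leftmost (biggest place) first
        total = total * 2 + d  # shift existing places up, add the new digit
    return total

print(from_binary([1, 0, 1]))                       # 1x4 + 0x2 + 1x1 = 5
print(from_binary([1, 0, 1, 1, 0, 1, 1, 1, 1, 0]))  # 734
```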

Good stuff if you are following so far, but how do abstract number representations help us invent Google or Call of Duty 37? Our circuits and logic gates still only represent 2 states, which is clearly not enough to do anything of use. We can get a lot more from our gates by chaining together multiple logic gates - using the output from one gate to serve as the input to the next. If you ascribe meaning to the placement of these gates then it is possible to have sub-circuits of gates that represent the ‘places’ of the binary number system.

To add 1 to 1 you can use an AND gate and state that its output means 2 (in decimal). Put another way, it asks the question “do I have 2?” and the answer is ‘yes’ if you have 1 AND 1, or ‘no’ in all other cases.

With a large number of these gates arranged, and agreement made about which digit is represented by each gate, you can create a simple circuit that adds up binary representations of numbers and outputs the correct answer (as another binary representation). By getting fancy with your gates and combinations you can also perform multiplication, subtraction and division. You could even arrange your gates in such a way that they monitor an extra input which decides what kind of sum to perform - 0 for add, 1 for subtract.
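Here is a sketch of such an adding circuit in Python. The `full_adder` below is the standard textbook arrangement of AND, OR and XOR gates for a single binary ‘place’ (the names and list layout are my own, chosen for illustration); chaining these together adds whole numbers:

```python
# Gate functions built from simple bit operations.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return (a | b) & ~(a & b) & 1  # 'on' if exactly one input is 'on'

def full_adder(a, b, carry_in):
    # One 'place' of the adder: two digit inputs plus a carry from
    # the previous place. Outputs this place's digit and the new carry.
    s1 = XOR(a, b)
    digit = XOR(s1, carry_in)
    carry_out = OR(AND(a, b), AND(s1, carry_in))
    return digit, carry_out

def add_binary(x, y):
    # x and y are lists of bits, least significant place first.
    result, carry = [], 0
    for a, b in zip(x, y):
        digit, carry = full_adder(a, b, carry)
        result.append(digit)
    result.append(carry)  # a final carry adds one more place
    return result

# 3 (binary 011) + 1 (binary 001), least significant bit first:
print(add_binary([1, 1, 0], [1, 0, 0]))  # [0, 0, 1, 0], i.e. binary 100 = 4
```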

Notice how I’ve used the words “input” and “output” with deliberate vagueness. Inputs can come from anywhere - switches, sensors, even other outputs. By combining inputs and outputs of transistors we have made logic gates, and by combining the inputs and outputs of logic gates we have started to represent numbers, arithmetic and decision making. With careful planning and patience it is possible to imagine an arrangement of logic gates that can cope with very large and complicated sums. What if it performed a calculation and then used the output as the input of the next calculation? This would greatly increase the complexity and range of calculations available in our computing circuit.

“But this is all numbers and maths, I want to post cat pictures on my feed, this is LITERALLY useless to me!” I hear you screech. The great thing about numbers is that there’s loads of them. Too many to count even. They can also represent things that are less numbery.

If I told you to write a letter using only numbers, how would you do it? You can’t really, unless the person at the other end agrees what the numbers you send them will mean. This is just like our shared assumptions about language - small abstract symbols that have agreed meaning, which we compose into communication, instruction, meaning and emotion, but are completely and utterly pointless if the recipient does not share your intended meaning.

So how would one write a letter in numbers? Pick a code (or cipher) to use and make sure the recipient will decode your message using the agreed upon code. A pretty obvious code would be to use numbers to represent the position of a letter in the alphabet: 1 = a, 2 = b, 3 = c etc. 8 5 12 12 15, 23 15 18 12 4!
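A quick Python sketch of this alphabet-position code (the `encode` and `decode` names are just illustrative):

```python
# The alphabet-position code: 1 = a, 2 = b, ... 26 = z.
def encode(text):
    # ord('a') is 97, so shifting gives position-in-alphabet numbers.
    return [ord(c) - ord('a') + 1 for c in text]

def decode(numbers):
    return ''.join(chr(n + ord('a') - 1) for n in numbers)

print(encode("hello"))              # [8, 5, 12, 12, 15]
print(decode([23, 15, 18, 12, 4]))  # world
```

Both sides have to agree on the code - swap in a different cipher and the same numbers mean something else entirely, which is exactly the point the article is making.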

We’ve just invented a way to represent letters using numbers. The computer doesn’t understand what these symbols mean any more than it knew the meaning of the numbers it was adding up. It is a system following a pre-programmed path with an agreed meaning on the inputs and outputs. We as humans are the ones who add the meaning. The computer has no understanding of meaning, no knowledge and doesn’t magically ‘do’ anything except trace a maze of logical decisions and aggregations at lightspeed to convert input into an output that we understand.

Have you noticed the pattern I’m conveying? Tiny bits such as letters or transistors can make slightly more useful things such as words or logic gates which can be combined into useful stuff such as sentences or calculators. The atomic parts by themselves are meaningless, but it is through a shared understanding and cooperative agreement that we can put meaning to abstraction and slowly build up incredibly powerful systems that in turn make us as a species greater than the sum of our parts. Screw you, gremlins.

Discover and read more posts from Andy Longhurst