Someone at work asked me this question: “Are Good Programmers also Good Mathematicians?”. It got me thinking about all the work people do in most IT companies. They call themselves programmers, but are they really?
In this article, I’m going to go a level deeper and talk about software craftsmanship and clean coding standards. So hopefully, by the end, you will have the answer to the question: “Are Good Programmers Good Mathematicians?”.
Let’s take a walk down memory lane to gain some insight.
The Invention of the Computer Age
In 1936, Alan Turing published a paper describing a machine that would be capable of carrying out any computation. This machine could be instructed by means of code and would perform any task based on the instructions written by programmers.
Sounds like modern-day computing, doesn’t it?
Well, this was the start of the era for computer programmers.
Alan Turing was one of the great mathematicians of his time, and he invented the universal Turing machine.
How Does a Turing Machine Work?
It was an abstract machine with unlimited memory (a tape) that would hold both data and instructions.
It consists of a scanner that goes back and forth, reading each symbol from the tape and altering symbols according to some program/algorithm. When the machine halts, whatever remains on the tape is the end result of the computation for the given algorithm.
Nobody at the time knew that the idea of the Turing machine would grow to become the basis of modern computing. But it did.
Anything that can be solved by our modern-day computers can, in principle, also be solved by a Turing machine.
If you want to experience programming a Turing machine, click on the link below:
The Beginning Of Programming
The Automatic Computing Engine (ACE) was one of the first electronic machines that behaved differently depending on the instructions it was given, much like our modern-day computers.
The difference was that ACE was built with vacuum tubes.
You could say that vacuum tubes are the predecessors of our modern transistors.
ACE was built with 1,450 vacuum tubes, which made it a huge, monstrous machine that filled an entire hall.
You can look at the ACE in the image below:
Memory was the Biggest Concern
ACE used 12 delay lines as its main memory. Each delay line could store 32 words of 32 bits each, holding either data or instructions.
Quick question: how many bits does your mobile phone’s memory hold?
About 3.2 × 10^10 bits in mine (RAM alone), i.e. 4 GB. And I still feel I need more memory.
Just imagine how quickly we grew to this level.
There was a time when a handful of 32-bit words was enough for developers to write code, perform complex operations, and get results.
Developers in the past knew the value of memory, so they kept memory usage to a minimum by devising new algorithms that we still use today.
You should learn from experienced programmers; they are the reason for our existence. You and I use languages that were created by them. Those languages are so successful because they were created by people who understood and valued the technology they were dealing with.
Many of them did not have Computer Science degrees; they were engineers who became programmers. And they were all great mathematicians.
Is it Necessary To Be Good In Mathematics To Become a Great Programmer?
Coming straight to the point – YES.
You need to have good mathematics skills in order to become a great programmer.
The so-called programmers of today are nowhere close to earning that title.
They are simply users.
A programmer is an inventor. A creator. Someone who makes things, who creates software that others can use. And to create such software, you need a good knowledge of mathematics.
Because mathematics allows you to formulate algorithms, and algorithms are the backbone of every great program ever written.
Would you be able to create software that takes training phrases from people and, on the basis of those phrases, decides which piece of code to execute?
If yes, then do send me a piece of your code; I would love to go through it.
And in trying, you will understand the need for mathematics.
Do you know Linus Torvalds?
Linus Benedict Torvalds is a Finnish-American software engineer who is the creator and, historically, the principal developer of the Linux kernel, which became the kernel for operating systems such as Linux distributions, Android, and Chrome OS.
Not only that, Linus Torvalds is also the creator of Git. He created Git while developing the Linux kernel. But the Git he created was not very pretty or user-friendly; you needed to be a Linux kernel hacker to use it back then :/
But then a community of developers came forward and built services such as GitHub, which gave Git a nice web interface and a place for users to store and maintain their code.
No doubt he is a genius. But all this genius didn’t come out of thin air. He is great at mathematics.
In the computer world, no matter how creative you are, at some point you have to express that creativity in precise, unambiguous logic.
Creating the world’s most popular and widely used operating system kernel is no piece of cake if you do not know how to feed your code into a machine in the best, most optimized way possible. And Linus Torvalds is a genius who made it possible.
Programming Is Hard If You Take Out the Abstraction
A computer works on instructions. Instructions are a well-defined set of computations, ordered in a certain way (called an algorithm), to achieve an end result.
What will happen if you simply say to a computer:
“add a and b, where a is 5 and b is 5”?
Nothing. You will have to feed it the entire thing, step by step.
Let me give you a high-level understanding of what goes behind.
At the most basic level, a computer is just one big, complicated electronic device that works on the basis of electrical signals.
More precisely, it works on the basis of electrical switching.
This switching takes place at a very high speed.
An i7 processor clocked at 3.7 GHz (and capable of boosting up to 4.7 GHz) goes through
3,700,000,000 clock cycles in a second.
Can you imagine?
That’s how fast a computer is… And we use this speed to perform useful calculations, more precisely, logical calculations.
Coming Back to the Point
So, coming back: to perform a simple addition, the logic below (a half adder) is used.
Here, A and B are the input bits. The truth table for the half adder looks like this:

A  B  |  Sum  Carry
0  0  |   0     0
0  1  |   1     0
1  0  |   1     0
1  1  |   0     1

The Sum output is A XOR B, and the Carry output is A AND B.
This is a simple one-bit adder. To add bigger numbers you need more bits. Let’s assume you have 4 bits to work with. If you want to increment a given number by one, you will need a combination of these half adders and an extra increment bit.
Let me quickly draw all the schematics required to demonstrate its working.
Below is the table of all the numbers that can be represented with 4 bits.
I will implement this logic with logic gates and see if everything works as planned.
Binary Coded Decimals
I have designed a very basic incrementer with the help of 4 half adders in series to work with 4-bit numbers.
I have 4 input bits, named A0, A1, A2 and A3. Each bit can be in one of two states, switch ON or switch OFF, i.e. 1 or 0. A combination of these 1s and 0s represents one of the numbers from the table above.
There is also one increment bit at the top left. When that bit is switched on, the output of the circuit is the input plus 1.
Let’s see it in action: let’s take the number 3 and increment it by one.
To build the number 3 from 4 bits, we have to set our input bits as
A0=1, A1=1, A2=0, A3=0
And switch on the increment bit.
So our expected output is: 0011 + 0001 = 0100 // 4
Let’s see what we get:
Now you know what logic is required to add one at the machine level. And you can also see that all of this is pure mathematics: it is called Boolean algebra. So programmers ought to be good mathematicians.
But we have evolved since those days and created an abstraction for every such piece of logic. So today, with a programming language, you can do the same thing by writing this:
a = 3; a++; // increments a by 1, from 3 to 4
Thank God our ancestors abstracted such complexity away from us and gave us simple languages that read pretty much like plain English. But that doesn’t mean you should forget the complex logic and calculation that goes on beneath them.
Why am I telling you all this?
The main purpose of all this is to show you what actually goes into the making of all those programming languages and compilers that most people take for granted.
It is true that you do not have to be a scholar in mathematics to program today. But to actually invent something that is better at solving people’s day-to-day problems, I tell you, you will have to do a lot more than what you are doing now.
You will have to learn mathematics, and learn how to turn your creativity into pure, unambiguous logic a computer can execute.
Only then can you call yourself a true programmer.
Programming in today’s world is easy. But becoming a good programmer takes a lot of knowledge, and that knowledge is not limited to programming languages; it extends to how programming languages are created.
You must know what’s going on at the minute level. Because until and unless you know how things work under the hood, you will never be able to enjoy programming at the level an expert does.
That’s why I always recommend that new programmers get their hands dirty with C. That language forces you to understand every line of code you write at a minute level, and to grow from there.
If you can program in C, you can program in any other language in this world. After all, most popular languages are descended from C.
Try to create a programming language of your own. You may choose any language to do this task.
Make sure you follow the best practices.
Let me know your thoughts, views, comments on this. I would love to hear from you.
Do share and comment.