The real cost of programs

I have often been asked why computers are so slow, and I have even wondered about it myself. I frequently forget just how much the computer has to do to produce the pretty graphical user interfaces we have grown to love. So, in my infinite (limited) wisdom, I have revisited the problem.

What we see in our programs is often a little work mounted upon the shoulders of vast giants. So let us strip away the vast GUI (Windows, GNOME/KDE on the X Window System, or macOS) and get back to what was once considered the height of computing: the terminal prompt and the super-sophisticated programming language C!

In any world of programming there is one thing that always needs doing first, and that is to say "Hello World!" to your new environment. So that is as good a place as any to begin: write a 'Hello World' program.

With the simplest of editors (I've chosen vi here) you can type in the following lines:

vi helloworld.c

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return 0;
}

and then save the program.

Then, at the command line (in this case on Linux), compile it:

gcc helloworld.c

and, if you have typed everything correctly, you should be able to run your fabulous program:

./a.out    (as we did not give our program an 'output' name, it defaults to a.out)

and we get

Hello World!

Great! You have now entered the world of programming.

But what did the computer see? What 'things' did it have to do to make this work? If all it did was write back to the screen the same characters I typed, that should be simple. So what does it do?

Well, to start with, it has to convert that program, the simple commands you typed in, into something it understands, and it does this in two stages. The first stage translates your C into a mnemonic language called assembly language, and it looks like this:


	.file	"helloworld.c"
	.section	.rodata
.LC0:
	.string	"Hello World!"
	.text
	.globl	main
	.type	main, @function
main:
	.cfi_startproc
	pushq	%rbp
	.cfi_def_cfa_offset 16
	.cfi_offset 6, -16
	movq	%rsp, %rbp
	.cfi_def_cfa_register 6
	movl	$.LC0, %edi
	call	puts
	movl	$0, %eax
	popq	%rbp
	.cfi_def_cfa 7, 8
	ret
	.cfi_endproc
	.size	main, .-main
	.ident	"GCC: (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1"
	.section	.note.GNU-stack,"",@progbits

and, while this is almost as readable as the little C program you typed in, it is a far more detailed description of the steps the machine must take to actually do something.

The next step converts the mnemonic symbols into a numeric language, the real language of computers, called binary. Binary is very hard for humans to read or understand, so I've taken the binary and 'disassembled' it into a hexadecimal interpretation of the machine language that we can, more or less, read.


Now, the first thing that will strike you is: WOW! That program got big. But there is more here than just your program; there are also the language inclusions of your program. Let's review:

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return 0;
}

If you didn't understand what you typed before, you wouldn't be alone. The two lines at the beginning are "including" things into the program. These are header files for the standard libraries, extensions you might say, that tell the compiler about functions such as printf, so it can understand what it needs to do, to do what you want.





