Tag Archives: Computing

Future Computing

Note that a computer is not built to do anything other than execute instructions. Hardware advances over the years have only improved the CPU's ability to fetch those instructions; the CPU does not make decisions about what to execute, or in what order to execute it. That organization comes from the basic boot loader, in combination with the operating system it loads.

There are no elements of artificial intelligence built into the hardware; it has no ability to reprogram itself or to change its wiring. External forces must be applied to force change, either by altering the microcode in the core of the CPU (should that be possible) or by executing programs within the confines of the operating system: instructions provided by the boot loader, or by the operating system loading and executing programs. Those processes constitute what a computer does with what it 'sees'.

Any hope of producing the next generation of computing must therefore be a revolution in how the CPU is instructed to perform its work, what is done with the output, and what associated hardware is connected to the system to perform the 'tasks' assigned by that process. Whether Windows, Linux/Unix, or any other operating system is better than another, each DOES create its own unique opportunities and restrictions for any new programming or computing paradigm.

Anything like artificial intelligence will have to be preceded by a new suite of hardware, with a new way of 'booting' the system and/or an entirely new operating system tailored to artificial intelligence operations. Current hardware/software standardization is at once the primary blockage to any future advance in computing.

The real cost of programs

I have often been asked why computers are so slow, and I have even wondered that myself. Frequently I have forgotten what it actually takes, what the computer has to do, to produce the pretty graphical user interfaces we have grown to love. So in my infinite (limited) wisdom I have revisited the problem.

What we see in our programs is often the construction of little works mounted upon the shoulders of vast giants. So if we strip away the vast GUI that is Windows, Gnome/KDE on X-Windows, or Mac OS, we get back to what was once considered the height of computing: the terminal prompt and the super-sophisticated programming language C!

In any world of programming there is one thing that always needs doing first, and that is to say "Hello World!" to your new environment. So that is as good a place as any to begin: write a 'Hello World' program.

With the simplest of editors (I've chosen vi here) you can type in the following lines:

vi helloworld.c

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return 0;
}

and then save the program.

Then at the command line (in this case Linux)

gcc helloworld.c

and if you have typed everything correctly, you should be able to run your fabulous program:

./a.out    (as we did not give our program an 'output' name, gcc defaults to a.out)

and we get

Hello World!

Great: you have now entered the world of programming.

But what did the computer see? What 'things' did it have to do to make this work? If all it did was write back to the screen the same characters I wrote, that should be simple. So what does it actually do?

Well, to start with, it has to convert that program, the simple commands you typed in, into something the machine understands, and it does this in stages. The first stage translates it into a mnemonic language called assembly language. And it looks like this:


	.file	"helloworld.c"
	.section	.rodata
.LC0:
	.string	"Hello World!"
	.text
	.globl	main
	.type	main, @function
main:
	.cfi_startproc
	pushq	%rbp
	.cfi_def_cfa_offset 16
	.cfi_offset 6, -16
	movq	%rsp, %rbp
	.cfi_def_cfa_register 6
	movl	$.LC0, %edi
	call	puts
	movl	$0, %eax
	popq	%rbp
	.cfi_def_cfa 7, 8
	ret
	.cfi_endproc
	.size	main, .-main
	.ident	"GCC: (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1"
	.section	.note.GNU-stack,"",@progbits

and while this is almost as readable as the little C program you typed in, it is a more detailed description of the steps the machine must take to produce the code that actually does something.

The next step converts the mnemonic symbols into a numeric language, the real language of computers, called binary. Binary, however, is very hard for humans to read or understand, so I've taken the binary and 'disassembled' it into a hexadecimal interpretation of the machine language that we can, more or less, read. And that looks like this:


Now, the first thing that will strike you is: WOW, that program got big! But there is more here than just your program; there are also the language inclusions of your program. Let's review:

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return 0;
}

If you didn't understand what you typed before, you wouldn't be alone. The two lines at the beginning are "including" things into your program. These are libraries, extensions you might say, that supply your program with the routines it needs to do what you want.