Language of computing

I was reminded this Christmas holiday season that computers do not ‘know’ any human language, only binary, and that it takes humans to provide the translation from the machine to something human-readable. And while most computer programming languages are ‘English-like’, they need not be based on English. That is just what happened first, and it could be changed to another language at any time.

This came to me in an inspired way, while listening to carols: non-native speakers were singing in Latin, and other non-English speakers were singing in English, or German, or French. You can sing in a language without knowing how to speak it.

I suspect that is the same method by which most non-English speakers program computers in ‘English-like’ programming languages: by layering another translation over the programming, or, as in singing, by using a part of the brain different from the one that provides language skills to converse with computers. Which rather makes the point that people who program do think with altered brains.

Future Computing

Note that the computer is not built to do anything other than execute instructions. Hardware advances over the years have only improved the CPU’s ability to fetch instructions; it does not make decisions about what to execute, or in what order to execute them. That organization comes from the basic boot loader, in combination with the operating system it loads.

There are no elements of artificial intelligence built into the hardware; it has no ability to reprogram itself or to change its wiring. External forces must be applied to force change, either by altering the microcode in the core of the CPU (should that be possible) or by executing programs within the confines of the operating system: instructions provided by the boot loader, or by operating systems loading and executing programs. It is those processes that constitute what a computer does with what it ‘sees’.
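To make that concrete, the whole job of a CPU can be sketched as a fetch-decode-execute loop. Here is a toy illustration in C; the two ‘opcodes’ are invented purely for this sketch and stand in for a real instruction set:

#include <stdint.h>
#include <stdio.h>

/* Two made-up opcodes for this toy machine. */
enum { OP_HALT = 0x00, OP_PRINT = 0x01 };

int main(void)
{
    /* A tiny "program" sitting in memory. */
    uint8_t memory[] = { OP_PRINT, 'H', OP_PRINT, 'i', OP_HALT };
    uint8_t pc = 0;                      /* program counter */

    for (;;) {
        uint8_t opcode = memory[pc++];   /* fetch the next instruction */
        switch (opcode) {                /* decode it                  */
        case OP_PRINT:                   /* execute: print the operand */
            putchar(memory[pc++]);
            break;
        case OP_HALT:                    /* execute: stop              */
            putchar('\n');
            return 0;
        }
    }
}

The loop has no opinion about what it runs; it simply executes whatever the memory holds, which is exactly the point.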

Any hope of producing the next generation of computing must therefore be a revolution in how the CPU is instructed to perform its instructions, what is done with the output, and what associated hardware is connected to the system to perform ‘tasks’ assigned by that process. The argument over whether Windows, Linux/Unix, or any other operating system is better than another DOES matter here: each choice creates its own opportunities and restrictions for any new programming and computing paradigm.

Anything like artificial intelligence will have to be preceded by a new suite of hardware, with a new way of ‘booting’ the system and/or an entirely new operating system tailored to artificial-intelligence operations. Current hardware/software standardization is at once the primary blockage to any future advance in computing.

Learning Computers and Computing

I have been doing research into the nature of computers, and I have been participating in the phenomenon known as CoderDojo. As part of my research I have been relearning assembly language on several different architectures, and I have been experimenting with such things as the ELF Membership Card, which I soldered myself and which is currently running in front of me, along with an Arduino Uno. These both represent small microprocessors, very like the ones I personally started out on.

My first computer was an Apple II+ with a MOS Technology 6502 processor. In any case, this act of relearning what a computer really is has made me aware of the lack of any real educational ‘tools’ like the ones I had. The sensation that is the Raspberry Pi is fast becoming the CPU-du-jour of developers, and as such may develop into a great educational tool. But, and there is always a but, it doesn’t stand on its own.

The group who developed the Pi have themselves noted that it is a developmental prototype, and that it needs to be distilled into a real educational product. It first needs a keyboard, mouse, display, SD memory card and power supply just to turn it on. To make it useful as a networked device it also needs a connection to hardwired Ethernet, and it needs software preloaded onto the SD card to boot properly. These are geek requirements: anyone who can make this work ALREADY has the working knowledge and equipment, call it infrastructure, to make it work. What is missing is a standalone environment that is self-contained and independent of both other systems and any foreknowledge of computing.

My Apple II came with a keyboard, memory, and a built-in BASIC programming language; it displayed its output on a common television, and recorded and loaded programs from a simple cassette player. All these elements were basic, everyday items in my household. It plugged into mains power directly, started up in Applesoft BASIC, and displayed on the screen everything I typed.

The Raspberry Pi now needs this type of infrastructure. And while on this subject, and not to stir the pot, comes a language issue. The Apple I learned on came with BASIC, and in fact I still have a fondness for BASIC. The current argument in the ‘programming education’ discussions is that a language like BASIC teaches BAD programming practice. Being old, I had to remember the motivations behind BASIC, and I was all the more enlightened to connect this with my re-education in assembly language. That was the first reason for BASIC! BASIC is and was engineered to follow, more or less, the structure of the instruction set of the CPU itself. Where language snobs see bad ‘GOTO’s in BASIC, I see machine-language conditional and unconditional ‘branch’ instructions. Where they see a BASIC with line numbers (not all BASICs have them), I see ‘linear’ machine instructions.
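To illustrate the point (a sketch in C rather than BASIC, since C is the language used later in this piece): the goto below plays exactly the role of a BASIC GOTO, and if you compile this with gcc -S you will find the compiler has turned it into the machine’s branch instructions (jmp, jle and friends):

#include <stdio.h>

int main(void)
{
    int i = 1;
loop:                        /* plays the role of a BASIC line number */
    printf("%d\n", i);
    i = i + 1;
    if (i <= 10)
        goto loop;           /* BASIC: IF I <= 10 THEN GOTO 20 */
    return 0;
}

The ‘bad’ GOTO is simply the branch instruction wearing friendlier clothes.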

One element of the Raspberry Pi that also misses the mark is the nature of ‘abstraction’. While I admire the Python of the Pi, and the ‘C’-like language of the Arduino, what is missing is any sense of the distance between the learner-programmer and the actual machine. It may even be a serious problem, as the machine begins to look like magic, something that can be made to do anything.

The programming of the RCA 1802 chip contained in the ELF Membership Card demonstrates what the creator of the card calls ‘bare-metal programming’. A simple program that I used to test the ELF consisted of 12 8-bit instructions; writing (essentially) the same program for the Arduino required downloading 998 8-bit instructions (not including the 512 bytes of the boot loader). To be sure, there were probably a lot of libraries included in that download. Helpful, but masking the actual operations of the CPU from any real educational product. Just like the Arduino, the Raspberry Pi masks the CPU and the associated hardware behind a boot loader (BIOS), followed by a full, though stripped-to-minimum, Linux kernel, a GUI in the form of the LXDE X-Windows desktop, and finally the Python language. That’s a lot of abstraction!
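If you want to check numbers like these yourself, the AVR toolchain behind the Arduino IDE ships a size utility; assuming your compiled sketch was left on disk as sketch.elf (the file name and location vary by IDE version), something like this reports how many bytes get flashed:

avr-size sketch.elf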

All these things may be irrelevant in the long term; the abstraction provided may even stimulate learners to explore the ‘bare metal’ hardware of the Raspberry Pi, while giving positive feedback through easy ‘wins’ on top of that abstraction. Still, I believe we are missing an opportunity to produce the next generation of computer wizards. I also believe that someone needs to integrate the Raspberry Pi into an OLPC type of device.

The real cost of programs

I have often been asked why computers are so slow, and I have even wondered that myself. Frequently I have forgotten what it actually takes, what the computer has to do, to produce the pretty graphical user interfaces we have grown to love. So in my infinite (limited) wisdom I have revisited the problem.

What we see in our programs is often the construction of little works mounted upon the shoulders of vast giants. So let us strip away the vast GUI, be it Windows, Gnome/KDE X-Windows, or Mac OS, and get back to what was once considered the height of computing: the terminal prompt and the super-sophisticated programming language C!

In any world of programming there is one thing that always needs doing, and that is to say “Hello World!” to your new environment. This is as good a place as any to begin: write a ‘Hello World’ program.

With the simplest of editors (I’ve chosen vi here) you can type in the following lines:

vi helloworld.c

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return EXIT_SUCCESS;
}

and then save the program.

Then, at the command line (in this case on Linux), compile it:

gcc helloworld.c

and if you have typed everything correctly, you should be able to run your fabulous program:

./a.out

(As we did not give our program an ‘output’ name, it always defaults to a.out.)

and we get

Hello World!

Great, you have now entered the world of programming.
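By the way, if you would prefer a friendlier name than a.out, GCC’s -o switch sets the output file:

gcc -o helloworld helloworld.c
./helloworld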

But what did the computer see? What ‘things’ did it have to do to make this work? If all it did was write back to the screen the same characters I wrote, that should be simple. So what does it do?

Well, to start with, it has to convert that program, the simple commands you typed in, into something it understands, and that is a very simple language indeed; it does this, however, in two stages. The first stage translates into a mnemonic language called assembly language. And it looks like this:

        .file   "helloworld.c"
        .section        .rodata
.LC0:
        .string "Hello World!"
        .text
        .globl  main
        .type   main, @function
main:
.LFB0:
        .cfi_startproc
        pushq   %rbp
        .cfi_def_cfa_offset 16
        .cfi_offset 6, -16
        movq    %rsp, %rbp
        .cfi_def_cfa_register 6
        movl    $.LC0, %edi
        call    puts
        movl    $0, %eax
        popq    %rbp
        .cfi_def_cfa 7, 8
        ret
        .cfi_endproc
.LFE0:
        .size   main, .-main
        .ident  "GCC: (Ubuntu/Linaro 4.6.1-9ubuntu3) 4.6.1"
        .section        .note.GNU-stack,"",@progbits
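You can reproduce this first stage yourself: the -S switch tells GCC to stop after translating to assembly, writing the listing to helloworld.s instead of compiling further:

gcc -S helloworld.c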

and while this is almost as readable as the little C program you typed in, it is a more detailed description of the steps that must be taken to produce code that actually does something. (Notice that our printf has already become a call to puts: since our string needed no formatting, the compiler substituted the simpler routine.)

The next step converts the mnemonic symbols into a numeric language, the real language of computers, called binary. Binary, however, is very hard for humans to read or understand, so I have taken the binary and ‘disassembled’ it into a hexadecimal interpretation of the machine language that we can, more or less, read. And that looks like this:
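The full listing runs on for pages, because the finished binary includes all the start-up code linked in around our program. A representative fragment, just our main function as printed by objdump -d a.out, looks like this (the addresses are illustrative and will differ from machine to machine):

0000000000400524 <main>:
  400524: 55                      push   %rbp
  400525: 48 89 e5                mov    %rsp,%rbp
  400528: bf 3c 06 40 00          mov    $0x40063c,%edi
  40052d: e8 de fe ff ff          callq  400410 <puts@plt>
  400532: b8 00 00 00 00          mov    $0x0,%eax
  400537: 5d                      pop    %rbp
  400538: c3                      retq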


Now, the first thing that will strike you is: WOW!, that program got big. But there is more here than just your program; there are also the language inclusions of your program. Let’s review:

#include <stdlib.h>
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return EXIT_SUCCESS;
}

If you didn’t understand what you typed before, you wouldn’t be alone. There are two lines at the beginning that are “including” things into the program. These are libraries, extensions you might say, to your program that help the computer understand what it needs to do, to do what you want.
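For instance, it is <stdio.h> that tells the compiler what printf is, and <stdlib.h> that supplies EXIT_SUCCESS. Greatly simplified, the relevant pieces those two lines pull into our program look roughly like this (the real headers contain far more):

int printf(const char *format, ...);   /* from <stdio.h>: declares printf       */
#define EXIT_SUCCESS 0                 /* from <stdlib.h>: 0 on typical systems */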