
Hello World - A history
Is it really simple, under the hood?

"What the hell is going on here?" - asked myself when I saw how the teacher created and showed up a black screen with a text of "hello world" in my first day learning programming. I was there with the dream of becoming an opponent of Google, for creating massive web and mobile applications that one day I could beat them, or else, buy them.
For a while feeling heavy-hearted, I realized that I had missed something important which my teacher told us before I came into the class (yes, I was late). Then we practiced with some very first problems like making simple operations, input someone's name and then output "hello <them>",...
I asked Google about "hello world" right after school because I knew that I had missed the most important thing of the starter pack to be a good programmer.

-------------

At first glance, it is just the most famous example program, the first one shown in nearly every programming language. This program tells the computer to print the text "hello, world" to the screen.

Not only that, but it can also serve as a sanity test for your environment.
For example, after you install the JRE on your computer, you need to make sure a Java program can actually run. So why not quickly create a super simple program as a test? Which one? How about printing something to the console? OK, then we can use "hello, world", which we have been taught from the very first days. After seeing those two words printed on the screen, you know for sure that your Java code can be compiled, loaded and run normally, so you don't have to worry about the environment setup anymore.
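A minimal check could look like the following sketch (the class name HelloWorld is just a convention, not something Java mandates):

public class HelloWorld {
    public static void main(String[] args) {
        // If this line appears on the console, the toolchain works:
        // the code was compiled, loaded by the JVM and executed.
        System.out.println("hello, world");
    }
}

Compile it with "javac HelloWorld.java" and run it with "java HelloWorld"; if the two words show up, your setup is fine.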

So why is it "hello, world", not "I love you 3000"?

Over the past several decades, it has grown into a time-honored tradition, each generation inheriting it from the previous one.
The first traces of "hello, world" appeared in 1973, in Brian Kernighan's Bell Labs tutorial "A Tutorial Introduction to the Language B", with a handful of lines of B code:
main( ) {
    extrn a, b, c;
    putchar(a); putchar(b); putchar(c); putchar('!*n');
}

a 'hell';
b 'o, w';
c 'orld';
Then, in 1974, Brian Kernighan wrote a Bell Labs memorandum called "Programming in C: A Tutorial", in which he gave a more explicit example:
main( ) {
        printf("hello, world\n");
}
Then in 1978, the very famous book "The C Programming Language", by Brian Kernighan and Dennis Ritchie, directly influenced by the above memorandum, gave an example program that prints "hello, world" (without capital letters or an exclamation mark) together with its output.

Why did our legend choose that phrase? No one knows, not even Kernighan himself. In an interview with Forbes India, he answered that question as follows:

“What I do remember is that I had seen a cartoon that showed an egg and a chick, and the chick was saying, ‘Hello, World.’”

And that was the beginning of that phrase.
The "C Programming Language" became so famous and been widely used as a textbook in colleges and universities. After that, most teachers, as well as documents, traditionally tend to use that example as theirs.
Moreover, after the born of PDP-11 and its successful commercial (600,000 units sold out), we officially stepped into the era of communicating with machines by programming. The more books about C are sold, the more PDP-11 machines are sold, the more people can do programming and the more "hello, world" becomes famous.

But is this simple program really simple, under the hood?

In my opinion, everything hides some mystery, even the simplest things. So does "hello, world".
Have you ever asked yourself how the text you type in the editor and pass as a parameter to a function ends up printed on that black screen?
In fact, "hello, world" must undergo a very long journey to become visible to us as output.
Remember that any usable computer must have at least two parts: software and hardware.
We type some code to create a program, which is software. How can that software interact with the hardware (specifically the monitor), and make the hardware do what we want (show a text)?
The answer is our butler - The Operating System (OS).
An OS is software that provides the base management of a computer and manages the environment in which other applications run. We can see it as the foundation layer of a computer, a platform on which we build things (our software).

Suppose we are the text "hello, world"; let's set off on our long journey.
Creation phase:
The programmer types some code in Java:
System.out.println("hello, world");
Then he compiles the code. We, together with the instructions, are compiled into byte-code, which is stored in a *.class file. The OS provides an API (Application Programming Interface) which programs use to request services from kernel mode, where the OS interacts with the hardware (the display). More specifically, under the hood of "System.out.println", Java invokes an OS system call. On Linux, the system call that prints to the console is write():

(Figure: a brief visualization of how a user application interacts with the OS.)
There is a lot more to read about kernel mode and user mode if you're curious.
So the byte-code now has the content:
Need OS help (use a system call) to print these things (us) to the screen.
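To make that more concrete, here is a hedged sketch in Java (the class name RawHello is mine, not from the JDK): FileDescriptor.out wraps the standard-output descriptor the OS gives our process (fd 1 on Linux), and writing to it eventually reaches the kernel's write() system call. This illustrates the same path System.out takes; it is not the actual JDK implementation.

import java.io.FileDescriptor;
import java.io.FileOutputStream;
import java.io.IOException;

public class RawHello {
    public static void main(String[] args) throws IOException {
        // FileDescriptor.out represents the standard output stream the
        // OS handed to our process; on Linux it corresponds to fd 1.
        FileOutputStream stdout = new FileOutputStream(FileDescriptor.out);
        // This write travels through the JVM and the C library down to
        // the kernel's write() system call, and from there to the screen.
        stdout.write("hello, world\n".getBytes());
        stdout.flush();
    }
}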

Runtime phase:
The programmer then runs the program. We (the byte-code) are now interpreted by the Java interpreter inside the JVM into machine code (with hot paths handled by Just-In-Time compilation). Machine code consists of instructions in machine language, executable directly on the CPU.

Slow down a minute: remember that when you install Java on your computer, you have to choose the build dedicated to your current OS. That's because the JRE is platform-dependent, and that is precisely what makes the compiled Java code (byte-code) platform-independent.
The Java interpreter, which lives inside the JVM, which in turn is a component of the JRE, is dedicated to a specific OS family. So when you execute the byte-code, the interpreter knows exactly the format, structure and standard of the ABI (Application Binary Interface) of the current OS while translating it into native (machine) code. You can think of the ABI as the compiled counterpart of the API: an API is source-code-based, while an ABI is binary-based.

Every OS family has its own ABI standard and format. So machine code is not only hardware-architecture-dependent (it is the same for the same CPU architecture, and cannot run on other CPU architectures), it also has to follow the ABI standard of the OS. That's why you can bring Java byte-code anywhere: the JVM installed on each platform does all the work, interpreting the byte-code according to the local ABI and CPU architecture. This is why we say compiled Java code is platform-independent.
So the byte-code:
Need OS help (use a system call) to print these things (us) to the screen.
becomes:
Hey Linux, I am using this system call (in your standard format) to print these things (us) to the screen.
in machine-code.
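A small, hedged illustration of this platform dependence (the class name PlatformInfo is mine, chosen for the example): the byte-code of this class is identical on every machine, yet each machine's JRE reports the OS and CPU architecture it was built for.

public class PlatformInfo {
    public static void main(String[] args) {
        // The same .class file runs everywhere, but each platform's
        // JRE knows its own environment and reports different values.
        System.out.println("OS:   " + System.getProperty("os.name"));
        System.out.println("Arch: " + System.getProperty("os.arch"));
        System.out.println("JVM:  " + System.getProperty("java.vm.name")
                + " " + System.getProperty("java.version"));
    }
}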
The machine code can now run on the CPU, and the OS continues its job of managing resources. It assigns a CPU core to handle our program (that is, to follow the compiled instructions). Later it may switch our program to another core; this is the context switch of the OS scheduling mechanism, which happens in preemptive-multitasking OS families (like Linux or Windows), in contrast with cooperative multitasking.

The CPU core follows the instructions until it reaches the "use the screen to display these things" part. In fact, the CPU doesn't interact with the monitor directly. It sends the command to a driver, which is also a program, one that operates or controls a particular type of device attached to the computer. The driver generates electric signals in a language the monitor can understand, and the monitor translates them into text on the screen.
So we are where we were supposed to be.

Our long journey took no more than a second of computing time. And along the way, it involved interesting mechanisms and concepts like OS resource management, system calls, drivers, hardware interaction and more.
So it really is simple, but under the hood it hides a bunch of mysteries, at least for me.

Conclusion

If I could go back to the old days, I would teach that stubborn little kid some lessons:
 - Never stop asking questions about the mysteries.
 - Dive deep into the little, simple things, then combine them, then try again and again, and you'll see a bigger picture.
 - Sorry, you won't beat Google, son... Just kidding! Keep dreaming, boy, and so will I!
So far so good; I'm glad to see you here for my first blog post. Every comment is appreciated. See you next time!

Pearly
