Let's start at the beginning

Scott Borduin

WBF Technical Expert (Software)
Jan 22, 2011
Portland, OR area
If you have never written a computer program - and I'm assuming that most of you haven't - you might well be under the impression that computer programming is an esoteric discipline requiring special skills possessed only by a few. But programming - like almost any discipline - can be exercised at a whole range of experience and skill levels. Mathematics at the Ph.D. level is incomprehensible to most of us, including me, but that doesn't mean that all math is difficult: 2 + 2 = 4. Here's a programming version of 2 + 2. Copy and paste (or type, if you really want the full experience) the following line of code into your browser's address bar (the place where you type in web site addresses):

javascript: alert("Hello world, I'm a programmer!");
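(And if you want the literal 2 + 2 from a moment ago, this variation works the same way - it's the same one-line JavaScript trick with the arithmetic spelled out:)

javascript: alert(2 + 2);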

Then just click on "OK" to end your new program. Congratulations, you're a programmer! In subsequent posts, you'll learn how to do much more clever things. But before you get too far into self-congratulation, I need to keep you a bit honest here. The reason that little bit of code worked at all is that it sits on top of literally millions of lines of other code, written by thousands of programmers you never met. And those programmers themselves worked for perhaps a dozen or more companies, and were mostly strangers to each other. And that whole edifice is, in turn, built on a few brief decades of technological history and development. How on earth did we get here?

Well, there are many pieces to that history, far too many to document here. But here is one big part of it:

[Photo: Grace Murray Hopper]
The woman in that picture is Grace Murray Hopper. If celebrity were a function of achievement, Hopper would be one of the 20th century's most remembered figures. I guess I might say that Hopper was the predecessor to Bill Gates and Steve Jobs, but it is probably unfair to Gates and Jobs to compare them to somebody that smart.

What did Hopper do? Lots of things, but arguably her most important achievement was to make the world of modern computer programming possible by developing the concept of a computer programming language.

You may know that the modern computer traces its lineage back to the World War II era, when scientists and cryptographers developed machines to help them do calculations and analyze patterns in order to crack enemy communications codes (that's right, hacking and identity theft are what computers were originally invented to do). One famous example is the machine the secret British Ultra project used to decrypt the German Enigma cipher and win the submarine war in the Atlantic. Another was the first computer Grace Hopper worked with, in 1943.

You'll notice that these computing machines were quite literally machines, with lots and lots of moving parts. They were built to do a specific job, like most machines before or after them. They were configurable only within the context of their explicit purpose, in the narrow sense that, say, your microwave oven is configurable. The mechanical switches in these machines soon gave way to electrical switches - vacuum tubes - and each computer offered a very limited set of low-level tasks that could be triggered: adding or subtracting two numbers, for instance. This set of tasks is often called the instruction set of a computer, and the human-readable expression of that instruction set is often called "machine language". By feeding the computer a very long list of these low-level instructions in machine language, you could get it to do some work for you.
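To make that concrete, here's a sketch of what such a tiny instruction set might look like - my own invented toy machine, written in the same JavaScript we used above, not any real computer's instruction set:

// A toy machine with two registers and three instructions.
// (Purely illustrative - no real computer uses this exact set.)
function run(program) {
  var reg = { A: 0, B: 0 };  // the machine's registers
  for (var i = 0; i < program.length; i++) {
    var ins = program[i];
    if (ins.op === "LOAD") reg[ins.to] = ins.value;           // put a number in a register
    else if (ins.op === "ADD") reg[ins.to] += reg[ins.from];  // add one register to another
    else if (ins.op === "PRINT") alert(reg[ins.from]);        // display a register's contents
  }
}

// Even on this generous toy machine, "2 + 2" takes four instructions:
run([
  { op: "LOAD", to: "A", value: 2 },
  { op: "LOAD", to: "B", value: 2 },
  { op: "ADD", to: "A", from: "B" },
  { op: "PRINT", from: "A" }
]);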

But programming computers via machine language was extremely limiting. Imagine trying to instruct a fellow human being how to perform a complex task using a total vocabulary of 20 words, with no visual, auditory, or other sensory cues available. That's what it's like programming in machine language. To replicate your simple Hello World program above would require many thousands, if not millions, of machine language statements.

Grace Hopper's innovation was to build a very special kind of program called a compiler. A compiler is a program that writes machine language for you, by interpreting a language designed to be understood by humans. A compiler, then, is just a translator, capable of speaking the arcane language of a particular piece of computer hardware when given the text of a language spoken by programmers.
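Again, a sketch may help. Here's a toy "compiler" for the toy machine above, reusing its run() function - purely hypothetical, and a thousand times simpler than anything Hopper built, but it shows the essential idea of translation:

// A toy "compiler": turns the human-readable text "print X + Y"
// into instructions for the toy machine sketched earlier.
function compile(source) {
  var m = source.match(/^print (\d+) \+ (\d+)$/);
  if (!m) throw new Error("I only understand 'print X + Y'");
  return [
    { op: "LOAD", to: "A", value: Number(m[1]) },
    { op: "LOAD", to: "B", value: Number(m[2]) },
    { op: "ADD", to: "A", from: "B" },
    { op: "PRINT", from: "A" }
  ];
}

// One human-readable statement becomes four machine instructions:
run(compile("print 2 + 2"));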

So a compiler accomplishes three important objectives. First, it allows us to program in a language better understood by human brains. Second, it allows programs written in a programming language to be compiled - translated - and run on different kinds of computing hardware (notice, there were no browser-specific or Mac/Windows-specific instructions for running our little program above). Third, as we've just seen, it allows the development of a language where just one word or statement can represent many thousands of instructions, and therefore dramatically speeds up the programming process.
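That second point deserves a tiny demonstration. Sticking with our hypothetical toy compiler, portability just means swapping the back end: the same human-readable source can be translated for an entirely different imaginary machine, without touching the source at all:

// A second toy back end: same source language, different "hardware".
// This imaginary machine has no registers; it works on a stack instead.
function compileForStackMachine(source) {
  var m = source.match(/^print (\d+) \+ (\d+)$/);
  if (!m) throw new Error("I only understand 'print X + Y'");
  return [
    { op: "PUSH", value: Number(m[1]) },
    { op: "PUSH", value: Number(m[2]) },
    { op: "ADD" },   // pops two values, pushes their sum
    { op: "PRINT" }  // pops and displays the top of the stack
  ];
}

function runStack(program) {
  var stack = [];
  for (var i = 0; i < program.length; i++) {
    var ins = program[i];
    if (ins.op === "PUSH") stack.push(ins.value);
    else if (ins.op === "ADD") stack.push(stack.pop() + stack.pop());
    else if (ins.op === "PRINT") alert(stack.pop());
  }
}

runStack(compileForStackMachine("print 2 + 2"));  // same source, different machine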

Hopper's innovation was the final conceptual step in the development of the modern general purpose computer, the innovation that made the computer an entirely different kind of machine. It is now so familiar to us that we take it for granted. We buy this thing called a computer, with its case and display and keyboard and mouse. And then, without making a single physical alteration, we load software, and magically convert this "computer" into a machine that can variously edit words, compute the family budget, play games, display pictures, play video and audio, design houses and roads and machines, and connect us to friends we'll never meet in person. The computer is the only machine we buy in the expectation that it will get better as we own it - because the software will get better, and enable new and better uses. That is a great part of the wonder of this thing we call software.

In future posts, we'll investigate this wondrous phenomenon from many angles, bottom up, top down, and in between. For now, I hope I've held your interest long enough to continue that investigation with me.
 

audioguy

WBF Founding Member
Apr 20, 2010
Near Atlanta, GA but not too near!
I started life out of college as a programmer (IBM 7094), and we programmed everything in assembly language. While it took longer to write a specific program, it was WAY more fun than (as we used to call it) Crudely Oriented, Barely Operable Language (COBOL). When Fortran and COBOL became more acceptable and less buggy, we moved to using them.

WOW! Brings back interesting memories!!
 

Chuck Bessey

New Member
Mar 6, 2011
Thanks, Scott - very informative for those of us in the post-Bill Gates era.
 

Ethan Winer

Banned
Jul 8, 2010
New Milford, CT
Great first post, Scott, and that photo of Grace Hopper is a keeper. As a 30-year programming veteran (DOS compiled BASIC, x86 ASM), I look forward to reading more from you.

--Ethan
 

Old Listener

New Member
Jul 18, 2010
SF Bay area
naturelover.smugmug.com
Great post, Scott. Few inventions have had as much potential for seemingly endless applications as the programmable computer. The invention of the compiler was a necessary step.

Perhaps you might write about the Fortran branch of compiler development. The ideas of libraries, reuse, modularity, and separate compilation proved to be central to modern software development.

For me, writing software starting in 1967 was like working on an open frontier. As hardware got more capable, new things became feasible. It was both work and recreation for me. The level of intellectual stimulation in designing and writing software (and testing it) would have been hard to match in other work I could have done.

Bill
 

vinylphilemag

WBF Founding Member
Apr 30, 2010
Kelowna, BC
www.vinylphilemag.com
I'm another veteran. Interpreted BASIC on my Commodore PET was my first language, followed by 6502 Assembly, Pascal, C, plus other (some quite esoteric) languages.
 

FrantzM

Member Sponsor & WBF Founding Member
Apr 20, 2010
Hi

Basic, then Fortran, PL/1, and Pascal, then never programmed again - although these days I'm thinking about Java as a hobby. Great post, Scott.
 

Scott Borduin

WBF Technical Expert (Software)
Jan 22, 2011
Portland, OR area
... Crudely Oriented, Barely Operable Language (COBOL). When Fortran and Cobol became more acceptable and less buggy, we moved to using them ...

ALGOL and punchcards here.....

Fortran IV, Basic and 4GLs here...

DOS compiled BASIC, x86 ASM

the Fortran branch of compiler development ...

Interpreted BASIC on my Commodore PET was my first language, followed by 6502 Assembly, Pascal, C, plus other (some quite esoteric) languages ...

Basic, then Fortran, PL1 and Pascal then never programmed again although these days thinking about Java as a hobby ..

I see I've found the programming community :) FORTRAN, C, C++, Lisp, Scheme, C#, and JavaScript here, with bits of others thrown in.

I hope that the proportion of comments from programmers vs. non-programmers will balance out over time, because I want to reach the less experienced audience too. I'm going to try to keep describing this stuff in a way that will have general interest. But hey, the reward in doing something like this is getting a sense of reaching other people, and I'm grateful for everybody's feedback.

(Quote from Old Listener)

Perhaps you might write about the Fortran branch of compiler development. The ideas of libraries, reuse, modularity and separate compilation proved to be central to modern software development

For me, writing software starting in 1967 was like working on an open frontier. As hardware got more capable, new things became feasible. It was both work and recreation for me. The level of intellectual stimulation in designing and writing software (and testing it) would have been hard to match in other work I could have done.

In the vague plan currently just in my head, one of the posts will investigate the layers of structure that programmers use to organize complexity - functions, libraries, objects. I will probably spend a bit of time on the history of such things, but use JavaScript to illustrate them, because JS is the programming environment that basically everyone has these days. And it is far more than the toy language I thought it was back in the late '90s ...
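As a tiny preview - a throwaway example of my own, nothing more - here are two of those layers in JavaScript: a function, which bundles instructions under a name, and an object, which bundles related data and functions together:

// A function gives a name to a bundle of instructions;
// an object groups related data and functions into one unit.
var greeter = {
  name: "WBF reader",
  greet: function () {
    alert("Hello " + this.name + ", I'm a programmer!");
  }
};

greeter.greet();  // shows the familiar dialog, now via our own little structure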

I think the analogy of writing software to working on the open frontier is a powerful one. While many more people are writing software today than in '67, the thrill of inventing something novel remains. The computer is like an incredibly flexible canvas capable of capturing the pure intellectual energy and creativity of the people painting upon it. To see something emerge just based on your own ability to envision and describe it is powerfully rewarding and addictive.
 

amirm

Banned
Apr 2, 2010
Seattle, WA
Anyone ever use the language called APL?

I did! I loved it. It has a super dense syntax. The efficient person in me loved it. Ran it on a timesharing Univac mainframe at school. That was the downer.

I forget now, but I think the assignment I had to do was solving a routing problem. It was fun both figuring out the logic and figuring out how to get the darn language to implement it.
 

The Smokester

Well-Known Member
Jun 7, 2010
N. California
No :). APL stands for A Programming Language. I know, not very creative. :)

Apple's first computer was out during the same time APL lived (the Apple II). It ran BASIC and assembly language only. APL ran on mainframes, as I mentioned.

Right. You could probably optimize the internet with one line of APL code. It was so terse that people had a problem remembering what a line meant five minutes after they wrote it, and I think largely for that reason it died out.

Since retiring I use mainly a language called Mathematica. It's a very flexible language that can take on some of the advantages - and downsides, if you're not careful - of APL. It's mostly used as an interpreted language but can be selectively compiled or interfaced with lower-level languages.
 

microstrip

VIP/Donor
May 30, 2010
Portugal
My first hardware programming project was carried out in 8080/8085 assembly language on an SDK-85 Intel system. MOV A, C ; ADD B - does anyone remember those?
Currently our students learn Python, an object-oriented language, as their first programming language.
 

flez007

Member Sponsor
Aug 31, 2010
Mexico City
Anyone ever use the language called APL?

I did, for a short period of time, until I was re-assigned to a DBA role (IMS) for the mainframe we had at that time. It was a fun language, as I recall, with lots of potential for interactive dialogue with users.
 

The Smokester

Well-Known Member
Jun 7, 2010
N. California
My first hardware programming project was carried out in 8080/8085 assembly language on an SDK-85 Intel system. MOV A, C ; ADD B - does anyone remember those?
Currently our students learn Python, an object-oriented language, as their first programming language.

Yes. Me too. We built our own hardware interfaces and programmed the "handlers" in assembly language (MACRO) for data acquisition systems on PDP-8s, -15s, and -11s.
 
