Thursday, July 12, 2012

Why programming has never really caught on...

Wait, I know what you're thinking


     "Is this guy retarded or something? Programming is a multi-billion-dollar business! Of course it's caught on!" That's not what I mean. Sure, there are thousands of professional coders out there, and millions of end-users (OK, probably closer to billions). But why don't those end-users code? The closest we, as a society, ever came to utilizing the power of programming en masse was the HTML/CSS boom of the early 2000s. Now, thanks to WordPress and scores of other turnkey solutions, that boom is in the rear view. Why aren't average schmoes coding?

     The reason has to do with symbolic leveraging, a cognitive phenomenon that occurs with any language, human or otherwise. The point behind symbolic leverage is that a symbol only has meaning if the reader has a built-in association for it. A good example is the Indus Valley Script. The civilization that read that script is extinct, so the language is now useless. To make matters worse, the language was highly developed -- not like hieroglyphs, whose meaning could often be gleaned by sight alone. The IVS symbols are highly abstracted (like the Arabic numerals used in English). The more highly abstracted a symbol (or programming module) is, the more useless it is to the uninitiated.

     The IVS leverages its symbols on information that already exists in the user's mind. This allows it to quickly cover massive amounts of highly complex concepts with an economy of words. This is great for the lazy writer, and horrible for the student. In the case of written 'natural languages' (there's no such thing -- but that's another story), this only works in generalized societies with institutionalized rules which attempt to enforce equality and freedom of property rights. After all, why spend your hard-earned money to train slaves or indentured servants to read when they can't own property and will only ever do manual labor?

     In our own world society, this has only been the case for about 150 years (in some places far, far less) out of a recorded history of about 7,000 years and a biological history of about 500,000. Amazingly enough, the bulk (some believe over 80%) of the last 500,000 years seems to have been spent without verbal language! In the historical past, only the rich had the time and resources to hire a teacher to decode their own written language for their education; and that was usually because their caste depended on property exchanges -- which are encoded in written language -- to maintain their status. Even then, it consumed years upon years of the human student's most prodigious period of learning: childhood. In short, language itself is a pretty recent invention!

     Now enter computer languages: most are less than 20 years old, most are not taught to the public, and ALL rely on symbolic leveraging to a degree that has never been known in the history of humanity. Even the modern math that computer languages are based on is less than a century old. Add to that the fact that computer languages are typically written by mathematicians and engineers: the people on planet Earth with the absolute WORST communication skills and by far the greatest stock of abstract symbols with which to leverage their new -- impossibly terse -- computer languages.

     The reason why this is a problem can best be explained in computer programming terms. In computer science, the programmer who chooses to write a program which discretely solves every possible problem a user can throw at it (brute-force programming) is called a schmuck. A good programmer, instead, creates declarative rules for problem solving, then creates a form of artificial intelligence which can use those rules to solve each new problem as it occurs. Unfortunately, coders never apply this principle to the structure of the programs they create. As a result, each stand-alone program is itself just a discrete collection of responses to all the foreseeable problems the user can throw at it. Ironically, we coders continue to find ever newer and more abstract techniques of using complex declarative rules to create a solution which, in and of itself, is merely a discrete collection of 'canned' responses to a shockingly limited number of inputs.
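To make the contrast concrete, here is a minimal sketch in Python (my own toy example, with invented function names -- not drawn from any particular program): the brute-force style hard-codes one canned response per anticipated input, while the declarative style states a few general rules and applies them to inputs it has never seen before.

```python
# Brute-force style: a discrete canned response for every foreseeable input.
def pluralize_bruteforce(word):
    canned = {"cat": "cats", "box": "boxes", "city": "cities"}
    return canned[word]  # any word not anticipated by the author simply fails

# Declarative style: a handful of general rules applied to ANY new input.
RULES = [
    (lambda w: w.endswith("y"), lambda w: w[:-1] + "ies"),
    (lambda w: w.endswith(("s", "x", "ch", "sh")), lambda w: w + "es"),
    (lambda w: True, lambda w: w + "s"),  # default rule
]

def pluralize_declarative(word):
    # Walk the rules in order; the first one that matches decides the answer.
    for matches, apply in RULES:
        if matches(word):
            return apply(word)
```

The declarative version handles "dog", "dish", or any other regular noun the author never thought of, while the brute-force version raises an error on anything outside its canned table.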

     As a result, the average user has no idea how to automate even the simplest of tasks on his computer. The language of automation is so obfuscated with entropic jargon and arbitrary syntax inspired by abstruse flights of mathematical fancy that he would need to devote a minimum of four years of his life (at the expense of all his other activities) to gain the knowledge needed to make his computer do what it was supposed to be doing for him all along. The only way we know to solve this conundrum is by paying a coder way too much money to come along and give us a handful more 'canned responses', which are themselves impossible for the end-user to automate. And so the cycle continues.
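For perspective, here is the scale of task I mean -- a hypothetical Python sketch (the function name and folder layout are my own invention) that files everything in a folder into subfolders by extension. It is the kind of five-minute automation that remains out of reach for most end-users:

```python
import os
import shutil

def tidy(folder):
    """Move each file in `folder` into a subfolder named after its extension."""
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            # "report.pdf" -> "pdf"; files with no extension go to "misc"
            ext = os.path.splitext(name)[1].lstrip(".") or "misc"
            dest = os.path.join(folder, ext)
            os.makedirs(dest, exist_ok=True)
            shutil.move(path, os.path.join(dest, name))
```

Ten lines, standard library only -- yet the jargon (paths, splitting, listdir) is exactly the entropic barrier described above.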

     The reason why the natural-language programming paradigm won't die (no matter how much it is demeaned by pro coders) is that it offers a way to do just that: to increase the intelligence of the computers we end-users use and unleash the truest power of the computer -- the power to automate. Until coders change the way they think about coding, computers will never reach the next level. Each new OS will look more and more like the last one, and the role of the CS pro will begin to diminish instead of increase. It's already starting to happen. Think about how little progress has occurred between Windows XP and 8, OS X Tiger and Lion, or Ubuntu 10 and 12.

     The future of commercial programming is natural language processing and its bosom buddy: AI. We need to start using the programming language not only as a way to create powerful and economical abstractions, but also as a way to begin absorbing the burden of symbolic leveraging instead of placing it entirely on the programmer. It's already happening; just witness the astronomical growth of Python and the cutting-edge Go project from Google. We have reached the limits of our capability to deal with abstractions effectively; this is why Haskell will never really catch on, and why Lisp never really did.
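As a small illustration of what absorbing that burden looks like in practice (my own toy example, not taken from Python's documentation): the same filter can be written symbol-first, or written so it leverages words the reader already knows.

```python
words = ["alpha", "beta", "gamma", "delta"]

# Symbol-heavy style: opaque to the uninitiated.
short_terse = list(filter(lambda w: len(w) < 5, words))

# Readable style: close to the plain-English sentence
# "each word in words, if the word is short".
short_plain = [word for word in words if len(word) < 5]
```

Both lines compute the same list, but only one of them asks the reader to already hold `filter` and `lambda` in his head.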

     Imagine a world where we use the computer not just to gossip with friends and send messages to one agency or another, but where the computer is an extension of your body, your will. The computer will be like an open-source personal secretary. Freed from repetitive, monotonous tasks, the end-user will be able to allot his development time toward specializing in that most noble professional niche: creativity. We will be free to apply our minds to tasks that really matter to our human survival, in ways that only humans can, using our imaginations. In terms of creative vision, that beats the crap out of Facebook!
