Sunday, July 8, 2012

The Limits of Human Understanding



     Contrary to popular belief, there is a very real limit to what humans can understand in real time. Sure, we can build new theories out of old theories and publish them to great effect. But in terms of real-time, moment-to-moment operation, humans are very limited in their capacity to carry out complex chains of instructions.

     In other words, academia has long since outlived its direct usefulness to the average human. In order for us to gain the fruits of state-of-the-art science, a for-profit corporation (like Microsoft) has to come along and develop applications with a user interface that falls squarely within the attention span of the average human. Scientists are just as susceptible to this phenomenon as bricklayers (just look at the sales figures for Mathematica and MATLAB). Nobody is exempt.

     Computer scientists simply create new abstractions and sell them as "apps". This is the grand totality of all human knowledge made tangible. If we simply accept this limitation in momentary human awareness, we can begin to make progress in education. For instance, if we can identify the point in a subject where the concepts have become so complex that people have stopped using them in public life (algebra: do you still know what an imaginary number is?), then we need only design the next group of abstractions to address those operations at the outer limits through some intuitive interface.

     A good (if oversimplified) example of this is the nines tool. You should know this, but I'll repeat it for argument's sake: nine times any single-digit number follows a pattern. The tens digit of the product is one less than the multiplier, and the ones digit is whatever is needed to bring the two digits to a sum of nine. Take 9 * 8: one less than 8 is 7, and 7 needs 2 to make nine, so 9 * 8 = 72. This intuitive leveraging tool allows us to internalize some of the more difficult operations and devote our momentary awareness to the next plateau of concepts; in this case, memorization of multiplication tables. Instead of fighting our limitations, we should accept them and layer our approach to education and knowledge creation around them.
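The trick is mechanical enough to write down in a few lines. Here's a small sketch in Python (the function name `nines_product` is my own, just for illustration) that encodes the rule and checks it against ordinary multiplication:

```python
def nines_product(n):
    """Multiply 9 by a single digit n (1-9) using the mental trick:
    the tens digit is one less than n, and the ones digit is
    whatever is needed to make the two digits sum to nine."""
    tens = n - 1
    ones = 9 - tens
    return tens * 10 + ones

# The rule agrees with plain multiplication for every single digit.
for n in range(1, 10):
    assert nines_product(n) == 9 * n

print(nines_product(8))  # the 9 * 8 = 72 example above
```

Nothing deep is happening here; the point is that the rule is a complete, self-checking abstraction over one corner of the multiplication table.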

     The pressure of routine life-and-death decisions can produce some startlingly pragmatic observations. For instance, the military (specifically the Air Force) has known about the limits of human understanding for a long time now. They even have a phrase for it: "task saturation". They also helped popularize the term "information overload". They now know that abstractions need to be employed in an intuitive way in order to keep their pilots alive. The academic world can take a lesson from them.

     After all, there are only so many ways a person can interact with the world around them. Thankfully, all but the world's most mind-boggling processes can be represented in visual or tactile ways that require a minimal amount of effort from the student (or jet pilot). The problem is finding these analogues and employing them consistently. A good starting point would be to incorporate the kinds of ideas found in books like Arthur Benjamin's Secrets of Mental Math into primary school education. Of course, most of the problems a computer scientist deals with are well beyond mental math. But there are always easier ways to solve a problem that do not involve symbolic rigor. For example, when you first approach systems of linear equations in a college algebra class, it can be a little overwhelming. There are several approaches to solving such systems through algebraic manipulation of symbolic notation. However, the easiest way of all is to plot each line and see where they all intersect. There are tricks like this at all levels of mathematics. We need only take advantage of them.
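The "plot the lines and look" idea has a direct computational analogue: the intersection point of two lines is just where both equations hold at once. As a rough sketch (the function `intersection` and its argument names are mine, not from any particular library), here is the two-line case worked out with Cramer's rule in plain Python:

```python
def intersection(a1, b1, c1, a2, b2, c2):
    """Find where the lines a1*x + b1*y = c1 and a2*x + b2*y = c2
    cross, using Cramer's rule on the 2x2 system.
    Returns None when the lines are parallel (zero determinant)."""
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None  # parallel (or identical) lines: no single crossing
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)

# x + y = 3 and x - y = 1 cross at the point (2, 1).
print(intersection(1, 1, 3, 1, -1, 1))
```

The picture on graph paper and the arithmetic above are the same act; one just happens to fit inside a student's momentary awareness better than a page of symbol shuffling.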

     For my purposes, just accepting these limitations is useful in and of itself. It means that students are allowed to think about programming in terms of what they know, and not in terms of whatever Latin-inspired symbolism is being espoused by the popular culture of the times.
