I understood the differences between what application developers wanted to do and what the “artificial intelligence” technology of the late 1980s supported. The differences were much greater than could be dealt with in a few software updates. What had been, in effect, a broad survey of application needs resulted in a snapshot of a more basic set of technical requirements. That snapshot taught me much about how software technology in general would develop, decades into the future.
Much of that future would evolve with or without me, as developers pushed to realize their dreams and the more basic technology developers – those involved in computer languages, for example – responded to demand by building their tools bottom-up. But I had become obsessed with what initially seemed like an unapproachable question – one that might raise a slight giggle around a lunch table. As computer languages and tools evolve to higher and higher levels - “bottom-up” - what will they eventually reach? Where's the top? Even if the question is unanswerable, the more I thought about it, the more value I found in the thinking.
There were also two general observations about people that allowed me to believe my ideas would be at least somewhat novel. The first is that people often mistake common, ordinary ideas for trivial ones. In fact, the opposite is more likely true. If something is common, it is likely important, or at least extremely useful. Simply cast the thought in different words – the basic economic measures of supply and demand, for example – and more people will agree: something becomes common in any culture because there is demand for it. The second is that, at the time, it was rather difficult to get many experts to think about the difference between the idea of a machine doing something on its own and what they themselves were doing. Many programmers scoffed at the idea of specialized rule-processing engines because they could already write if-then statements in their own programs. Management consultants and statisticians saw no reason to be interested in theoretical math engines that could build their own models and solve problems, because statistical packages already let them select from a variety of basic mathematical forms and discover the parameter values that produced a “best fit.”
Even as I write today, a quarter century after my thoughts on “high level logic” began, I wonder whether I will hit the same snags as I try to drum up interest. That software tools evolve to support higher-level functionality is quite commonly understood. And then there is the trigger that set my search in motion: every systematic analyst (professional or not) knows that it is important to define and understand a problem before expending much effort to solve it. They have done it themselves, over and over again.
Let me proceed, then, to something else that was already common before I began my quest. Not only is it not my invention, but, if I recall correctly, it was first introduced to me in grade school. This is certainly the kind of thing that could send readers off to another page, convinced it is proof that I have nothing new to say. A general problem-solving algorithm has long been described that begins with defining the problem. I trust the process will be recognized by a great many people.
1. Define the Problem or Task
2. Assemble Detailed Information
3. Perform Analysis
4. Identify Alternatives
5. Recommend Solutions
This is a description of an extremely general process, and that is certainly a characteristic sought in the search for high level logic. Not only do professionals systematically follow such a process in dealing with complex problems and issues, but it is difficult to think of any even slightly complex process that cannot be cast in its mold. If I, for example, feel hungry at 10:30 at night while watching television, I will already have defined the problem: I am hungry. I will then proceed to the kitchen to assemble information. Looking into the fridge, I will see what is available. I will analyze that information, considering how I might use the ingredients to make something to eat. That will produce a set of alternatives, from which I will choose. (“Recommend Solutions” is a phrase chosen in context; I had been considering the development of better “expert systems.”)
Having given a good bit of thought to rule-based expert system technology, it was easy to see that the above general problem-solving algorithm could be used as the design of a software engine, which I describe below. Once I was certain that the idea was concrete and feasible to implement, I began thinking of it as the rule-processing equivalent of “structured programming” – i.e. the addition of for, while, and if-then statements to “high level” languages, and of functions replacing goto statements. Not miracles, but an extremely useful advance in computer technology.
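To make that idea concrete, here is a minimal sketch of what such an engine might look like, with each of the five steps behind its own interface. The class and method names are illustrative assumptions of mine, not the actual HLL design.

```java
// A minimal sketch (assumed names, not the actual HLL design) of the
// five-step algorithm wired into a single, domain-independent engine.
import java.util.List;
import java.util.Map;

interface ProblemDefiner    { Problem define(Map<String, Object> initialFacts); }            // step 1
interface InformationSource { Map<String, Object> gather(Problem problem); }                 // step 2
interface Analyzer          { Analysis analyze(Problem problem, Map<String, Object> data); } // step 3
interface AlternativeFinder { List<Alternative> identify(Analysis analysis); }               // step 4
interface Recommender       { List<Alternative> recommend(List<Alternative> options); }      // step 5

record Problem(String statement) {}
record Analysis(String summary) {}
record Alternative(String description, double score) {}

final class ProblemSolvingEngine {
    private final ProblemDefiner definer;
    private final InformationSource source;
    private final Analyzer analyzer;
    private final AlternativeFinder finder;
    private final Recommender recommender;

    ProblemSolvingEngine(ProblemDefiner definer, InformationSource source, Analyzer analyzer,
                         AlternativeFinder finder, Recommender recommender) {
        this.definer = definer;
        this.source = source;
        this.analyzer = analyzer;
        this.finder = finder;
        this.recommender = recommender;
    }

    // The same five steps run regardless of the application domain.
    List<Alternative> solve(Map<String, Object> initialFacts) {
        Problem problem = definer.define(initialFacts);               // 1. define the problem or task
        Map<String, Object> details = source.gather(problem);        // 2. assemble detailed information
        Analysis analysis = analyzer.analyze(problem, details);      // 3. perform analysis
        List<Alternative> alternatives = finder.identify(analysis);  // 4. identify alternatives
        return recommender.recommend(alternatives);                  // 5. recommend solutions
    }
}
```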
Rule-based expert systems had a great deal of success in focused diagnostics, which meant no new investigation was needed to see that the same approach could be used in the first step of the general problem-solving algorithm. To fill things out, however, the process needed to proceed to step 2 rather than move too quickly toward the last step – making recommendations. The concept of “objects” (although not necessarily their implementation in object-oriented programming languages) easily suggested the transition from step 1 to step 2. Each meaningful “entity” involved in the problem definition would have associated characteristics and resources. Resources can include sources of information, such as databases or flat files. Where circumstantial information is needed in each run, either updated information is drawn from these information sources or the user is asked to supply it.
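A rough sketch of that entity idea, again under names I have assumed for illustration: each entity carries its characteristics plus a list of resources that step 2 can consult when a value is missing, with the user as the fallback.

```java
// Illustrative sketch (assumed names): each entity named in the problem
// definition carries characteristics plus the resources used to fill them in.
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;

/** A source the engine can consult during step 2: a database, a flat file, etc. */
interface Resource {
    Optional<Object> lookup(String characteristic);
}

final class Entity {
    final String name;
    final Map<String, Object> characteristics = new HashMap<>();
    final List<Resource> resources;

    Entity(String name, List<Resource> resources) {
        this.name = name;
        this.resources = resources;
    }

    /** Step 2: fill in a missing characteristic from the entity's own resources. */
    Object valueOf(String characteristic) {
        return characteristics.computeIfAbsent(characteristic, key ->
            resources.stream()
                     .map(resource -> resource.lookup(key))
                     .flatMap(Optional::stream)
                     .findFirst()
                     // In a full system this is where the user would be asked instead.
                     .orElseThrow(() ->
                         new IllegalStateException("No resource provides: " + key)));
    }
}
```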
The need to integrate step 3 into the process was one of the best-known problems at the time. The age of open systems development had not yet arrived. Popular rule-based expert system tools had been built as stand-alone products supporting only rule processing; they could not be integrated with other programs and were not extensible. The JBoss rule-processing system mentioned in chapter 1 deals with this problem directly. The rule-processing software is built in Java and carries with it the characteristics of the underlying language. A rule process can be initiated from another program, and program objects can be referenced directly in rules. This gives programmers much greater flexibility to fashion their own hard-coded relationships between rules and other components in each application.
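As an illustration of that integration style, here is a small sketch using the modern Drools (KIE) API; the fact class, rule text, and session configuration are my own assumptions and are not taken from chapter 1.

```java
// Sketch of embedding a rule process in ordinary application code with the
// modern Drools (KIE) API. Names below are illustrative assumptions.
//
// Example rule (in a .drl file on the classpath) referencing the Java object
// directly:
//
//   rule "Escalate large orders"
//   when
//       $o : Order( total > 1000 )
//   then
//       $o.setPriority("HIGH");
//   end

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class Order {
    private final double total;
    private String priority = "NORMAL";

    public Order(double total) { this.total = total; }
    public double getTotal() { return total; }
    public String getPriority() { return priority; }
    public void setPriority(String priority) { this.priority = priority; }

    public static void main(String[] args) {
        // The rule process is initiated from ordinary application code...
        KieServices services = KieServices.Factory.get();
        KieContainer container = services.getKieClasspathContainer();
        KieSession session = container.newKieSession(); // default session from kmodule.xml

        // ...and ordinary Java objects are handed to it as facts.
        Order order = new Order(2500.0);
        session.insert(order);
        session.fireAllRules();
        session.dispose();

        System.out.println(order.getPriority()); // "HIGH" if the rule fired
    }
}
```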
In a more general approach, these relationships should be discovered automatically from the resources associated with the entities defined in the problem definition stage – the same mechanism used in support of the second step. Applications can then be changed merely by changing the data associated with them, just as is done in general rule processing and expert systems. The program itself, a general processing engine, would not need to be changed, either when updating an application or when building an entirely new one.
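One way to picture that data-driven arrangement is a fixed engine that reads each application's entities and resources from a plain data file; the file format and names below are assumptions made purely for illustration.

```java
// Minimal sketch of the data-driven idea: the engine stays fixed while each
// application is just a data file naming entities and their resources.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

final class ApplicationDefinition {
    // entity name -> resource identifiers (JDBC URLs, file paths, "ask-user", ...)
    final Map<String, List<String>> entityResources = new HashMap<>();

    // Each non-comment line of the application file: "<entity> -> <resource>"
    static ApplicationDefinition load(Path file) throws IOException {
        ApplicationDefinition app = new ApplicationDefinition();
        for (String line : Files.readAllLines(file)) {
            if (line.isBlank() || line.startsWith("#")) continue;
            String[] parts = line.split("->", 2);
            app.entityResources
               .computeIfAbsent(parts[0].trim(), k -> new ArrayList<>())
               .add(parts[1].trim());
        }
        return app;
    }
}
```

Swapping in a different data file changes the application; the engine code above it never changes.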
In the final two steps, I primarily fell back on rule-processing and rule-based expert systems concepts again, but operating on the information accumulated from the first three steps.
Another thought occurred to me that fit the idea of artificial intelligence research perfectly. It might make my imagination seem slightly flexible, but I saw a one-to-one relationship between the general problem solving algorithm and the structure of good presentations I had learned in a high school speech course. (I was in fact considering just how general the above algorithm was, and how broadly it could be applied. There seemed no end to the possibilities, when one started replacing words like “problem” while maintaining the same processing logic.)
Monroe's motivated sequence:
A. Attention step
B. Need step
C. Satisfaction step
D. Visualization step
E. Action step

The basic speech structure:
A. Introduction
B. Body of the speech
C. Conclusion
It occurs to me now, so many years after the fact, that it is difficult to recall all the thoughts that brought the general problem-solving process into correspondence with the idea of presentations. I am not a numerologist, but cannot do better at this moment than to note that both processes have five steps. After considering several examples, I became convinced that by adding well-developed descriptive text as additional resource elements associated with problem entities, a program could automatically generate well-formulated reports in five sections, complete with data gathered and derived through each step of the problem-solving process. The reports I envisioned could be as good as, if not better than, those produced by many human consultants.
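A rough sketch of how such a report generator might be arranged: section titles follow the five steps, and canned descriptive text supplied with the application is merged with the data each step produced. All of the names here are illustrative assumptions, not part of any published design.

```java
// Sketch (assumed names): a five-section report assembled from descriptive
// text attached to the application plus the data produced by each step.
import java.util.Map;

final class ReportGenerator {
    // Section titles follow the five steps of the problem-solving process.
    private static final String[] SECTIONS = {
        "Problem Definition", "Information Gathered", "Analysis",
        "Alternatives Considered", "Recommendations"
    };

    /**
     * @param narrative   descriptive text supplied with the application, keyed by section title
     * @param stepResults data produced during the run, keyed the same way
     */
    String generate(Map<String, String> narrative, Map<String, String> stepResults) {
        StringBuilder report = new StringBuilder();
        for (String section : SECTIONS) {
            report.append(section).append('\n')
                  .append(narrative.getOrDefault(section, "")).append('\n')
                  .append(stepResults.getOrDefault(section, "")).append("\n\n");
        }
        return report.toString();
    }
}
```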
This was interesting, but obviously not the top of the logic hierarchy. Many different problems arise each day – many different tasks to take care of. Solving one of them, or even many, can hardly be said to represent the top of the process by which human activities are managed. Somewhere around 1990, I met with an industrial engineering professor who had been interested in artificial intelligence applications for years. We had a long chat about the possibility of completely automated factories. This was still a decade before online purchasing and customer management systems became commonplace. But it seemed reasonable to contemplate a future in which everything from initial customer contact, sales, accounting, instructions to the factory floor, robotic manufacturing on demand, and packaging, right out to the shipping dock, would be fully automated.
Again – not a numerologist – but note that a more general description of the presentation process is given in just three parts. Coincidentally, that is the number of elements I think necessary in the next step in high level logic – the subject of chapter 3.