Definition of algorithm

An algorithm is a deterministic automaton for accomplishing a goal which, given an initial state, will terminate in a defined end state. The efficiency of an implementation of an algorithm depends upon its speed, size, and resource consumption. We will discuss definitions, classifications, and the history of the concept.


What is an algorithm?

No generally agreed-upon definition of "algorithm" exists.

A simple definition: A set of instructions for solving a problem.

An algorithm is either implemented by a program or simulated by a program. Algorithms often have steps that iterate (repeat) or require decisions, such as logic tests or comparisons.

A very simple example of an algorithm is multiplying two numbers: on early computers with limited processors, this was accomplished by a routine that loops a number of times determined by the first number, adding the second number on each pass (a complete example is given at the end of this article). The algorithm translates a method into computer commands.

Algorithms are essential to the way computers process information, because a computer program is essentially an algorithm that tells the computer what specific steps to perform (in what specific order) in order to carry out a specified task, such as calculating employees' paychecks or printing students' report cards. Thus, an algorithm can be considered to be any sequence of operations which can be performed by a Turing-complete system. Authors who assert this thesis include Savage (1987) and Gurevich (2000):

...Turing's informal argument in favor of his thesis justifies a stronger thesis: every algorithm can be simulated by a Turing machine ... according to Savage [1987], an algorithm is a computational process defined by a Turing machine.

Typically, when an algorithm is associated with processing information, data is read from an input source or device, written to an output sink or device, and/or stored for further processing. Stored data is regarded as part of the internal state of the entity performing the algorithm. In practice, the state is stored in a data structure.

For any such computational process, the algorithm must be rigorously defined: specified in the way it applies in all possible circumstances that could arise. That is, any conditional steps must be systematically dealt with, case-by-case; the criteria for each case must be clear (and computable).

Because an algorithm is a precise list of precise steps, the order of computation will almost always be critical to the functioning of the algorithm. Instructions are usually assumed to be listed explicitly, and are described as starting 'from the top' and going 'down to the bottom', an idea that is described more formally by flow of control.

So far, this discussion of the formalization of an algorithm has assumed the premises of imperative programming. This is the most common conception, and it attempts to describe a task by discrete, 'mechanical' means. Unique to this conception of formalized algorithms is the assignment operation, setting the value of a variable. It derives from the intuition of 'memory' as a scratchpad.

For some alternate conceptions of what constitutes an algorithm see functional programming and logic programming.
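To make the contrast concrete, here is a small, purely illustrative sketch in Python (not drawn from any of the works cited below): the first version relies on assignment to a 'scratchpad' variable, while the second expresses the same computation without reassigning any variable.

from functools import reduce

# Imperative style: assignment updates a named "scratchpad" cell step by step.
total = 0
for value in [1, 2, 3, 4]:
    total = total + value   # overwrite the variable's previous value

# A functional-style rendering of the same computation, with no reassignment:
# the result is produced by folding a function over the list.
total_functional = reduce(lambda acc, v: acc + v, [1, 2, 3, 4], 0)

print(total, total_functional)  # prints: 10 10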

Definitions of algorithms

Blass and Gurevich

Blass and Gurevich describe their work as evolved from consideration of Turing machines, Kolmogorov-Uspensky machines (KU machines), Schönhage machines (storage modification machines, SMMs), and pointer machines (linking automata) as defined by Knuth. The work of Gandy and Markov is also described as an influential precursor.

Gurevich offers a 'strong' definition of an algorithm (that is summarized here):

Turing's informal argument in favor of his thesis justifies a stronger thesis: every algorithm can be simulated by a Turing machine. In practice, though, such a simulation would be ridiculous. Can one generalize Turing machines so that any algorithm, never mind how abstract, can be modeled by a generalized machine? But suppose such generalized Turing machines exist. What would their states be? A first-order structure. A particular small instruction set suffices in all cases. Computation could be an evolution of the state; it could be nondeterministic, could interact with its environment, could be parallel and multi-agent, and could have dynamic semantics. The two underpinnings of their work are Turing's thesis and Tarski's notion of first-order structure.

The above phrase, "computation as an evolution of the state", differs markedly from the definition of Knuth and Stone: the "algorithm" as a Turing machine program. Rather, it corresponds to what Turing called the complete configuration, which includes both the current instruction (state) and the status of the tape. Kleene (1952) shows an example of a tape with six symbols on it (all other squares blank) and how to "Gödelize" its combined table-tape status.
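As a hedged illustration (a generic prime-power encoding, not necessarily Kleene's exact scheme), the following Python sketch collapses a complete configuration into a single number:

def godel_number(state, head, tape):
    # Encode a configuration as 2^state * 3^head * p3^s1 * p4^s2 * ...,
    # where p3, p4, ... are successive primes and the s_i are symbol codes.
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    n = primes[0] ** state * primes[1] ** head
    for i, symbol in enumerate(tape):
        n *= primes[i + 2] ** symbol
    return n

# A tape holding six symbols (coded as small integers), the machine in state 2,
# scanning square 4: the whole configuration becomes one integer.
print(godel_number(2, 4, [1, 0, 1, 1, 0, 1]))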

Boolos and Jeffrey

Their definition is:

"Explicit instructions for determining the nth member of a set, for arbitrary finite n. Such instructions are to be given quite explicitly, in a form in which they could be followed by a computing machine, or by a human who is capable of carrying out only very elementary operations on symbols."

Knuth

Knuth (1968, 1973) has given a list of five properties that are widely accepted as requirements for an algorithm:

  1. Finiteness: "An algorithm must always terminate after a finite number of steps"
  2. Definiteness: "Each step of an algorithm must be precisely defined; the actions to be carried out must be rigorously and unambiguously specified for each case"
  3. Input: "...quantities which are given to it initially before the algorithm begins. These inputs are taken from specified sets of objects"
  4. Output: "...quantities which have a specified relation to the inputs"
  5. Effectiveness: "... all of the operations to be performed in the algorithm must be sufficiently basic that they can in principle be done exactly and in a finite length of time by a man using paper and pencil"

Knuth offers as an example the Euclidean algorithm for determining the greatest common divisor of two natural numbers.
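A minimal Python sketch of the Euclidean algorithm (using the remainder operation; Knuth's own presentation and notation differ in detail):

def gcd(m, n):
    # Repeatedly replace (m, n) by (n, m mod n); when the remainder reaches
    # zero, the other value is the greatest common divisor.
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(119, 544))  # prints: 17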

Knuth admits that, while his description of an algorithm may be intuitively clear, it lacks formal rigor, since it is not exactly clear what "precisely defined" means, or "rigorously and unambiguously specified" means, or "sufficiently basic", and so forth. He makes an effort in this direction in his first volume where he defines in detail what he calls the "machine language" for his "mythical MIX... the world's first polyunsaturated computer".
Many of the algorithms in his books are written in the MIX language. He also uses tree diagrams, flow diagrams and state diagrams.

Markov

A. A. Markov (1954) provided the following definition of algorithm:

1. In mathematics, "algorithm" is commonly understood to be an exact prescription, defining a computational process, leading from various initial data to the desired result....
The following three features are characteristic of algorithms and determine their role in mathematics:
a) the precision of the prescription, leaving no place to arbitrariness, and its universal comprehensibility -- the definiteness of the algorithm;
b) the possibility of starting out with initial data, which may vary within given limits -- the generality of the algorithm;
c) the orientation of the algorithm toward obtaining some desired result, which is indeed obtained in the end with proper initial data -- the conclusiveness of the algorithm.

He admitted that this definition "does not pretend to mathematical precision". His 1954 monograph was his attempt to define algorithm more accurately; he saw his resulting definition -- his "normal" algorithm -- as "equivalent to the concept of a recursive function". His definition included four major components (a small interpreter sketch follows the list):

  1. Separate elementary steps, each of which will be performed according to one of the given substitution rules.
  2. Steps of local nature (the algorithm won't change more than a certain number of symbols).
  3. The scheme of the algorithm is a list of rules for the substitution formulas.
  4. A means to distinguish a concluding substitution (a final state).
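A hedged Python sketch of these components (an illustration only, not Markov's own formalism): an ordered list of substitution rules is applied to a word, one leftmost replacement per step, until a concluding (terminal) rule fires or no rule applies.

def run_markov(rules, word, max_steps=1000):
    # rules: ordered list of (left, right, is_terminal) substitution formulas.
    for _ in range(max_steps):
        for left, right, terminal in rules:
            if left in word:
                word = word.replace(left, right, 1)  # leftmost occurrence only
                if terminal:
                    return word  # a concluding substitution ends the process
                break
        else:
            return word  # no rule applies: the algorithm concludes
    raise RuntimeError("step limit reached")

# A single non-terminal rule "ba" -> "ab" repeatedly swaps adjacent letters,
# so every a ends up before every b, and then the algorithm halts.
print(run_markov([("ba", "ab", False)], "babba"))  # prints: aabbb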

In his Introduction Markov observed that the entire significance for mathematics of efforts to define algorithm more precisely would be in connection with the problem of a constructive foundation for mathematics.

Minsky

Minsky (1967) asserts that an algorithm is an "effective procedure" and thereafter uses the latter term in place of "algorithm" throughout his text. The term is also used by Knuth. Here is his definition of an effective procedure:

A set of rules which tell us, from moment to moment, precisely how to behave.

But he recognizes that this is subject to a criticism:

The interpretation of the rules is left to depend on some person or agent.

He makes a refinement: to specify, along with the statement of the rules, the details of the mechanism that is to interpret them. To avoid the cumbersome process of having to do this over again for each individual procedure, he hopes to identify a reasonably uniform family of rule-obeying mechanisms. Here is his formulation:

(1) a language in which sets of behavioral rules are to be expressed, and
(2) a single machine which can interpret statements in the language and thus carry out the steps of each specified process.

In the end, though, he still worries that "there remains a subjective aspect to the matter. Different people may not agree on whether a certain procedure should be called effective".

But Minsky is undeterred. He immediately introduces "Turing's analysis of the computation process". He quotes what he calls Turing's thesis:

Any process which could naturally be called an effective procedure can be realized by a Turing machine.
(This is also called Church's thesis).

After an analysis of "Turing's Argument", he observes that the equivalence of many intuitive formulations by Turing, Church, Kleene, Post, and Smullyan "leads us to suppose that there is really here an objective or absolute notion".

Stone

Stone (1972) and Knuth (1968, 1973) were professors at Stanford University at the same time, so it is not surprising that there are similarities in their definitions:

To summarize ... we define an algorithm to be a set of rules that precisely defines a sequence of operations such that each rule is effective and definite and such that the sequence terminates in a finite time.
Stone is noteworthy for his detailed discussion of what constitutes an effective rule: his robot, or person acting as a robot, must have some information and abilities within them, and if not, the information and the ability must be provided in "the algorithm":
For people to follow the rules of an algorithm, the rules must be formulated so that they can be followed in a robot-like manner, that is, without the need for thought... however, if the instructions [to solve the quadratic equation, his example] are to be obeyed by someone who knows how to perform arithmetic operations but does not know how to extract a square root, then we must also provide a set of rules for extracting a square root in order to satisfy the definition of algorithm.
(...) not all instructions are acceptable, because they may require the robot to have abilities beyond those that we consider reasonable.
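A hedged Python sketch of such a supplied rule (an illustration, not Stone's own example or notation): a square-root procedure stated entirely in terms of arithmetic operations, so that someone who knows only arithmetic can follow it.

def square_root(a, iterations=20):
    # Heron's method: repeatedly average a guess x with a / x, using only
    # addition, division, and comparison -- operations the follower already knows.
    x = a if a > 1 else 1.0
    for _ in range(iterations):
        x = (x + a / x) / 2
    return x

# With a rule like this supplied, the instructions for solving a quadratic
# equation no longer presuppose an ability to extract square roots.
print(square_root(2.0))  # prints approximately 1.4142135623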

He gives the example of a robot confronted with the question "Is Henry VIII a King of England?", told to print 1 if yes and 0 if no, but which has not been previously provided with this information. And worse, if the robot is asked whether Aristotle was a King of England, and it has been provided with only five names, it would not know how to answer. Thus:

An intuitive definition of an acceptable sequence of instructions is one in which each instruction is precisely defined so that the robot is guaranteed to be able to obey it.

After providing us with his definition, Stone introduces the Turing machine model and states that the set of five-tuples that are the machine's instructions is an algorithm, known as a Turing machine program. He then says that a computation of a Turing machine is described by stating (a toy illustration is given below):

1. The tape alphabet
2. The form in which the parameters are presented on the tape
3. The initial state of the Turing machine
4. The form in which answers will be represented on the tape when the Turing machine halts
5. The machine program.

This is in the spirit of Blass and Gurevich.
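The following Python sketch (an illustration under assumed conventions, not Stone's notation) spells out those five items for a toy machine that computes the successor of a unary number:

def run_turing_machine(program, tape, state="start", blank="B", steps=1000):
    # program: (state, scanned symbol) -> (symbol to write, head move, next state)
    tape = dict(enumerate(tape))  # sparse tape; unlisted squares hold the blank
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        write, move, state = program[(state, tape.get(head, blank))]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# 1. tape alphabet: {"1", "B"}; 2. the parameter is a block of 1's (unary);
# 3. initial state: "start"; 4. the answer is the block of 1's left on the tape;
# 5. the machine program below, which appends one more 1 (computes n + 1).
successor = {
    ("start", "1"): ("1", "R", "start"),  # move right over the existing 1's
    ("start", "B"): ("1", "R", "halt"),   # write a 1 on the first blank and halt
}
print(run_turing_machine(successor, "111"))  # prints: 1111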

Some issues

Expressing algorithms

Algorithms can be expressed in many kinds of notations:
- Natural language expressions of algorithms tend to be verbose and ambiguous, and are rarely used for complex or technical algorithms.
- Pseudocode and flowcharts are structured ways to express algorithms that avoid the ambiguities, while remaining independent of a particular implementation language.
- Programming languages are primarily intended for expressing algorithms in a form that can be executed by a computer, but are often used as a way to define or document algorithms.

Must an algorithm halt?

Some writers restrict the definition of algorithm to procedures that eventually finish. Others, such as Kleene, include procedures that could run forever without stopping. Such a procedure has been called a "computational method" by Knuth and a "calculation procedure or algorithm" by Kleene. However, Kleene notes that such a method must eventually exhibit "some object".

Minsky (1967) makes the observation that, if an algorithm hasn't "terminated", then how can we answer the question "Will it terminate with the correct answer?"

In general, the answer is undecidable: no single analysis can decide, for every algorithm and every input, whether it will terminate, although termination can be proved for many particular algorithms. The analysis of algorithms for their likelihood of termination is called "termination analysis".
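A small Python illustration (the well-known Collatz procedure, not taken from the sources above): no one has proved whether this loop halts for every positive integer, which shows how hard the termination question can be even for very short programs.

def collatz_steps(n):
    # Halve even numbers, map odd n to 3n + 1, and count the steps until n = 1.
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # prints: 111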

Algorithm analysis

The terms "analysis of algorithms" was coined by Knuth. Most people who implement algorithms want to know how much of a particular resource, such as time or storage, is required for the execution. Methods have been developed for the analysis of algorithms to obtain such quantitative answers.

The analysis and study of algorithms is one discipline of computer science, and is often practiced abstractly (without the use of a specific programming language or hardware). But Scriptol code is portable, simple, and abstract enough for such analysis.
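As a hedged illustration of such an analysis (in Python rather than Scriptol), the repeated-addition multiplication used in the example below can be instrumented to count its basic operations; the count grows in direct proportion to the second argument.

def multiply_by_repeated_addition(x, y):
    # Multiply by adding x to an accumulator y times, counting the additions.
    total, additions = 0, 0
    while y > 0:
        total += x
        y -= 1
        additions += 1
    return total, additions

# 7 additions for y = 7, a thousand for y = 1000: the cost is linear in y.
print(multiply_by_repeated_addition(5, 7))  # prints: (35, 7)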

Simple example: multiplication

int multiply(int x, int y)
   int sum = 0
   while y > 0
      sum = sum + x
      y = y - 1
   /while
   return sum

int a = 5
int b = 7

print a, "x", b, "=", multiply(a, b)
