Algorithm Lesson Notes
An algorithm is the answer to a question that begins "How do you...?"
- The notion of an algorithm is a very powerful one, for a single algorithm "extracts" what a large number of different happenings may have in common.
- Whereas an eye-witness account records what happens once, an algorithm encodes what could ever happen.
- Reduce actions to sub-actions until you see loops and conditionals.
Some notes from Knuth: "A process or set of rules to be followed in calculations or other problem-solving operations, esp. by a computer: a basic algorithm for division."
It is generally assumed that an algorithm must satisfy the following five requirements (cf Knuth, pages 1-9):
- Finiteness: "An algorithm must always terminate after a finite number of steps." (p. 4) "A useful algorithm should require not only a finite number of steps, but a very finite number, a reasonable number." (p. 6, Knuth's italics)
- Definiteness: "Each step of an algorithm must be precisely defined; the actions to be carried out must be rigorously and unambiguously specified for each case." (p. 5)
- Input: "...quantities which are given to it initially before the algorithm begins. These inputs are taken from specified sets of objects" (p. 5)
- Output: i.e., an effect. "Quantities which have a specified relation to the inputs." (p. 5)
- Effectiveness: "... all of the operations to be performed in the algorithm must be sufficiently basic that they can in principle be done exactly and in a finite length of time by a man using paper and pencil." (p. 6)
My version of this would eliminate #1. Why does it have to be finite? An algorithm can have effects while it's running. And I'd combine #3 and #4 into a single sense of using resources: data that is manipulated. Specifically, input doesn't need to be given before the algorithm begins, and the output doesn't need to have a relation to the input (e.g., Hello world).
Wikipedia says that Euclid's Algorithm for finding the GCD of two integers is "significant in that it is one of the oldest algorithms known, dating back to the Greeks." Euclid (300 BC). FALSE. As long as there have been sentient beings there have been algorithms. Old recorded algorithms: Code of Hammurabi (1780 BC), agriculture techniques in Egyptian paintings.
We can begin by playing lots of games. Everyone can bring a deck of cards and we can play a giant game of Nertz. Perhaps playing cards can be the first "resources" that we develop our own algorithms for. Hmmm, a rich resource, subdividable into various categories of integers (odd, even, face); suits; multiple decks. Steps can include moving a card between piles, looking at a card. Perhaps everyone should be required to have a deck of cards. Adapt some of the algorithm exercises for cards (arranging).
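As a first card exercise, here is my own sketch of an algorithm built only from the two steps mentioned above, looking at a card and moving a card between piles. The (rank, suit) encoding and pile names are assumptions for illustration, not part of any real exercise yet.

```python
import random

# A sketch of a card algorithm using only two primitive steps:
# "look at a card" and "move a card between piles".
# The (rank, suit) encoding and pile names are my own assumptions.

ranks = list(range(1, 14))                # 1 = ace ... 13 = king
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(r, s) for s in suits for r in ranks]
random.shuffle(deck)

piles = {s: [] for s in suits}            # one destination pile per suit

while deck:                               # loop: until the deck is empty
    card = deck.pop()                     # action: take the top card
    rank, suit = card                     # action: look at the card
    piles[suit].append(card)              # action: move it to its suit pile

for s in suits:
    print(s, len(piles[s]))               # each pile ends up with 13 cards
```

The point of the sketch is that even a trivial sorting task decomposes cleanly into a loop over atomic actions.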
I need to develop a formal vocabulary for discussing algorithms. This should be applicable to every algorithm we discuss. So for example I should be able to take any algorithm from the Exercises and break it down using this formal vocabulary.
An algorithm has to have steps. Step 1, step 2, step 3, etc. Time seems to be an important element.
I'm not liking the data and data structure split any more. I'm going to rework this.
- action – An atomic instruction, command, operation, etc. Whatever makes sense in the domain of the data. Actions can include the use of tools that may or may not have their own data.
- data – The stuff that's being manipulated, affected, changed, altered, looked at, etc. Cards, numbers, names, Playstation parts, etc. Also data structure: a higher level arrangement of the data. A deck of cards. A data structure can be manipulated in its own right, and the individual elements can be manipulated.
- conditional – A sort of meta-action that performs an action only if a certain condition is met (said to be 'true'). Expressed in pseudocode as if ... then ... else. A conditional is a special kind of action that is independent of the data-domain specific actions (the condition however is data-domain specific).
- loop – Technically just a branch (as in "loop back to the beginning") or a conditional and a branch (as in "loop back to the beginning if there's more to do"), or an action repeated some fixed number of times (as in "do this five times"). But it's such a common construct in algorithms that it is useful to think of it as a basic element. Also called a cycle. It's easiest just to recognize where loops occur in algorithms, as in "keep doing this until the condition is met" or "repeat this ten times." Theoretically speaking a loop is just an optimization, replacing a repeated block of actions (eg, "step left, step right, step left, step right, step left, step right") with one block and a repeat (eg, "repeat the actions step left, step right three times"). But who wants to unroll all those loops?
These two are less important to developing a theoretical appreciation of algorithms. A branch is actually more basic than a loop, but I find it easier to talk about conditionals and loops with the branching implied. However if you want to build a computer from the ground up it's all about branches (especially conditional branches). A block is just an easy way to talk about groups of actions.
- branch – A skip ahead or back to another action, interrupting the normal flow of actions. Also called a jump or go-to.
- block (of actions) – A group of adjacent actions. Blocks are a way to lump actions together to aid understanding an algorithm. Conditionals and loops will often specify a block (more than one action) to perform.
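To make the vocabulary concrete, here is a small sketch of my own (not taken from the Exercises) with each vocabulary element labeled in the comments. The deck encoding is an assumption for illustration.

```python
# A toy algorithm — "deal every face card from a deck into a separate
# pile" — annotated with the vocabulary above.

deck = [(r, s) for s in ["clubs", "diamonds", "hearts", "spades"]
               for r in range(1, 14)]     # data structure: a deck of cards
face_pile = []                            # data structure: an empty pile

for card in list(deck):                   # loop: once per card in the deck
    rank, suit = card                     # action: look at the card
    if rank >= 11:                        # conditional: is it a face card?
        # block: two adjacent actions performed together
        deck.remove(card)                 # action: take it out of the deck
        face_pile.append(card)            # action: put it on the face pile

print(len(face_pile))                     # 12 face cards (J, Q, K x 4 suits)
print(len(deck))                          # 40 cards remain
```

Note that the conditional and the loop are data-domain independent, while the actions (looking, removing, appending) and the condition itself (rank >= 11) belong to the card domain, which is exactly the split the definitions above describe.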
How do you...?
An algorithm is the answer to a question that begins "How do you...?"
Good introductory idea: Each person completes their own version of the sentence "How do you...?" and then finds an answer from reference material. They then break the answer down into formal vocabulary, identifying data and data structures, actions, conditionals, blocks and loops. I just looked up "How do you gut a fish?" and got this wonderful scaling algorithm:
To see if the fish needs scaling, run the back of a knife at nearly a 90 degree angle along its body from tail to head. If the scales pop up and are quite thick, continue scraping until the fish is smooth. Give it another quick rinse. (If the knife just passes over the scales you don't need to bother).
Data: fish, fish scales, scale thickness, fish smoothness. Actions: "run the back of a knife", "scrape", "rinse". Conditionals: "If the scales pop up...", "If the knife just passes over...". Loops: "continue scraping until...".
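The same breakdown can be restated as code to expose the structure. The fish here is simulated: the number of scrapes needed is a made-up parameter, and the condition functions stand in for what a cook would actually observe, so this is a sketch of the control flow only.

```python
# The scaling algorithm above, restated to expose its structure.
# The fish is simulated: `scrapes_needed` is an assumed parameter
# standing in for a real fish's condition.

def scale_fish(scrapes_needed=3):
    scrapes_needed = max(scrapes_needed, 0)
    # action: run the back of a knife along the body, tail to head
    scales_pop_up = scrapes_needed > 0    # what the test stroke reveals
    if scales_pop_up:                     # conditional: "If the scales pop up..."
        smooth = False
        scrapes = 0
        while not smooth:                 # loop: "continue scraping until..."
            scrapes += 1                  # action: scrape
            smooth = scrapes >= scrapes_needed
        return "rinsed"                   # action: give it a quick rinse
    return "no scaling needed"            # "If the knife just passes over..."

print(scale_fish(3))       # rinsed
print(scale_fish(0))       # no scaling needed
```

Everything the prose recipe says maps onto one conditional wrapping one loop, which is a nice demonstration that the formal vocabulary applies outside computing.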
For some unknown reason I've avoided learning about Babbage, his Analytical Engine and Ada Lovelace. I've just learned that the Analytical Engine used a system of cards from the Jacquard loom as the stored program. That there's a great contemporary description of the Engine (originally written in French). That the English translation was made by Lovelace, who added a bunch of commentary at the end, and that's where her great writing is. Here it is.
The How, not the What
Algorithms are single-mindedly focused on the "how" of the process, not the "what". They're all about symbol manipulation (good discussion: what are symbols?), and the symbols can mean anything.
For a great example, consider the process of doing math in roman numerals.
Let's start with an addition problem: 23 + 58. In Roman numerals, that's XXIII + LVIII. We'll begin by writing the two numbers next to each other: XXIII LVIII. Next, we rearrange the letters so that the numerals are in descending order: LXXVIIIIII. Now we have six Is, so we'll rewrite them as VI: LXXVVI. The two Vs are the same as an X, so we simplify again and get LXXXI, or 81, as our final answer. (We can check this answer using Arabic numerals.)
Now let's try another addition problem: 14 + 17, or XIV + XVII. Notice that the I in XIV is being subtracted, so this problem is going to be a little more complicated. We begin the way we did before, by writing the numbers side by side: XIV XVII. The subtracted I in XIV cancels out one of the Is in XVII, so we cross them both out, leaving XV XVI. Next we put the remaining letters into the right order: XXVVI. Simplifying gives us XXXI, or 31.
Notice that you can follow the algorithm moving the letters around and combining patterns without even having to know what the letters stand for.
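The steps above can be sketched as pure string manipulation. This is my own rendering, and a couple of details are assumptions: instead of crossing out canceled letters, subtractive pairs are expanded into additive runs first (IV becomes IIII, which comes to the same thing), and the rule tables and function name are invented for illustration.

```python
# Roman-numeral addition as pure symbol manipulation, following the
# steps in the text: expand subtractives, write the numbers together,
# sort into descending order, then combine repeated symbols upward.
# The rule tables and names are my own assumptions.

ORDER = "MDCLXVI"  # descending value order, used only as a sort key

# Subtractive pairs rewritten as additive runs (e.g. IV -> IIII)
EXPAND = {"IV": "IIII", "IX": "VIIII", "XL": "XXXX",
          "XC": "LXXXX", "CD": "CCCC", "CM": "DCCCC"}

# Combining rules: a run of symbols replaced by the next symbol up
COMBINE = [("IIIII", "V"), ("VV", "X"), ("XXXXX", "L"),
           ("LL", "C"), ("CCCCC", "D"), ("DD", "M")]

def add_roman(a, b):
    # Step 1: remove subtraction so every symbol is purely additive
    for pat, rep in EXPAND.items():
        a = a.replace(pat, rep)
        b = b.replace(pat, rep)
    # Step 2: write the numbers side by side and sort descending
    s = "".join(sorted(a + b, key=ORDER.index))
    # Step 3: repeatedly combine runs into the next symbol up
    changed = True
    while changed:
        changed = False
        for pat, rep in COMBINE:
            if pat in s:
                s = "".join(sorted(s.replace(pat, rep), key=ORDER.index))
                changed = True
    return s

print(add_roman("XXIII", "LVIII"))  # LXXXI (23 + 58 = 81)
print(add_roman("XIV", "XVII"))     # XXXI  (14 + 17 = 31)
```

Nothing in the function knows what any letter is "worth"; it only matches and rewrites patterns, which is exactly the point about symbol manipulation.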
The same is true for Arabic numerals, but we're so programmed to "see" the quantity ten when we see "10" that it's very hard to get back to simply being aware of the digits as symbols. Kind of like blurring your eyes, or trying to hear English as a foreign language. Blur your mind.
In semiotics, a sign is generally defined as "something that stands for something else, to someone in some capacity." (Marcel Danesi and Paul Perron, "Analyzing Cultures"). It may be understood as a discrete unit of meaning, whether denotative or connotative. Signs are not just words, but also include images, gestures, scents, tastes, textures, sounds — essentially all of the ways in which information can be processed into a codified form and communicated as a message by any sentient, reasoning mind to another.
According to Saussure (1857-1913), a sign is composed of the signifier, and the signified. These cannot be conceptualized as separate entities but rather as a mapping from significant differences in sound to potential (correct) differential denotation. He emphasizes that the relationship between a sign and its extralinguistic denotatum is arbitrary. There is neither a natural relationship between a word and the object it refers to, nor is there a causal relationship between the inherent properties of the object and the nature of the sign used to denote it.
Charles Peirce (1839-1914) proposed a different theory. Unlike Saussure who approached the conceptual question from a study of linguistics and phonology, Peirce was a Kantian philosopher who distinguishes "sign" from "word", and characterizes it as the mechanism for creating understanding. The result is not a theory of language, but a theory for the production of meaning that rejects the idea of a stable relationship between a signifier and its signified. Rather, Peirce believed that signs establish meaning through recursive relationships that arise in sets of three. The first three distinct components he identifies were:
- object: anything that can be thought, whether as a concept or thing, so long as it is capable of being encoded in a sign;
- representamen: the sign that denotes the object (cf. Saussure's "signifier"); and
- interpretant: the meaning obtained by decoding or interpreting the sign which may be:
- immediate, i.e. the denotative meaning,
- dynamical, i.e. the meaning actually produced by the sign, or
- final, i.e. the meaning that would be produced if the sign were properly understood.
It is now agreed that the effectiveness of the acts that may convert the message into text (including speaking, writing, drawing and physical movements) depends upon the knowledge of the sender. If the sender is not familiar with the current language, its codes and its culture then he or she will not be able to say anything at all, whether as a visitor in a different language area or because of a medical condition such as aphasia (see Roman Jakobson). Modern theories deny the Saussurian distinction between signifier and signified, and look for meaning not in the individual signs, but in their context and the framework of potential meanings that could be applied. Such theories assert that language is a collective memory or cultural history of all the different ways in which meaning has been communicated and may, to that extent, be constitutive of all life's experiences (see Louis Hjelmslev). This implies that speaking is simply one more form of behaviour and changes the focus of attention from the text as language, to the text as a representation of purpose, a functional version of the author's intention. But, once the message has been transmitted, the text exists independently.
Spock: Computer. This is a Class-A compulsory directive. Compute, to the last digit, the value of pi.
Magic tricks make good algorithms.
Algo in Spanish means (adv.) somewhat, a little; (pron.) something.
Turing machine: A useful thought experiment that led to the Church-Turing Thesis, but not all that helpful in understanding how computers work.
The world gets a lot more interesting when you pit algorithms one against another. Thus was born Game Theory. I think a lot of fun learning can be had this way. RobotWar?
Synonyms for algorithm: strategy, game plan, policy, tactic, set of rules, laws, instructions, doctrine, method, formula, function, code (as in code of conduct), recipe, process, sequence, directions, procedure, description
Cool. I didn't know the history of "Alice and Bob" examples until I came across this Wikipedia article.