People and Computers Compared

Joe Malkevitch
York College (CUNY)

Introduction

Humans—Homo sapiens—often compare themselves to other species such as dolphins, whales, elephants, chimpanzees and cephalopods. Humans now know that these species exhibit intelligence and in some cases use tools—one of the abilities that humans prize as being special. However, other species do not communicate with each other using written language, nor do they appear to muse over which triples of positive integers $a$, $b$ and $c$ satisfy:

$$ a^2 + b^2 = c^2.$$

(Such triples of integers are called Pythagorean triples.)

While there have been some claims that species other than humans can count, even if this is true to some extent, these species do not seem to do mathematics in the sense needed even to write down the equation above.

To assist with doing mathematics, including arithmetic, humans have developed mechanical devices, calculators and computers. Many years ago, for example, the abacus was developed to speed up the process of doing calculations with numbers accurately.

Diagram of an abacus, courtesy of Wikipedia.

Today, we might use a supercomputer to enhance the speed with which a mathematical calculation can be done.

A portion of the Aurora exascale supercomputer at Argonne National Laboratory, showing racks and part of the water-cooling system. Photo by Michael Linden, CC BY-NC 2.0.

The area of artificial intelligence (AI) has evolved rapidly and raises questions about which of the things humans do can also be done by machines. Here I will consider a small part of the history of mathematical interaction with machines and computers: the theory developed by the mathematicians Alonzo Church and Alan Turing. Church and Turing interacted as teacher and doctoral student, as well as fellow scholars working in the area of mathematical logic.

Algorithms past and present

One fundamental issue in computer science is the notion of an algorithm. Roughly, an algorithm is a step-by-step procedure that one can carry out to accomplish a particular goal. Carrying out an algorithm is a bit like following a recipe for, say, baking a cake: to make a cake one assembles ingredients and carries out a collection of instructions, step by step. Now, consider the following problem: Given the numbers 80 and 145, find $a$, the largest positive integer that divides both of these numbers, and $b$, the smallest positive integer that both of these numbers divide. One might want to design an algorithm that would solve not just these two specific instances but the general questions of finding the greatest common divisor and the least common multiple of two integers.

Thinking about these two specific problems raises some general issues about step-by-step procedures that answer a more general or abstract question. For example, suppose you have a procedure that, whenever a problem has a solution, stops after a finite amount of time with the answer. If the procedure has already run for 68 hours, you might become nervous about whether it will ever stop, or whether, if it does stop, it will do so after you are no longer alive!

Though much research in modern computer science involves questions of how data and algorithms are represented in a computer, the notion of algorithm is far older. Here are general statements of two problems that we've already discussed in this column.

Example 1: Fix a positive integer $N$. Find all positive integer Pythagorean triples $a$, $b$, and $c$ that satisfy $a+b+c=N$. (Recall this means that $a^2+b^2=c^2$.)
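To make Example 1 concrete, here is a minimal Python sketch; the function name perimeter_triples is my own, and brute force is just one possible approach. It scans over the two legs, which forces the value of $c$, and keeps a triple exactly when the Pythagorean condition holds:

```python
def perimeter_triples(N):
    """Find all Pythagorean triples (a, b, c) with a + b + c = N.

    Scan a <= b; then c = N - a - b is forced, and the triple is
    kept exactly when a^2 + b^2 = c^2.
    """
    triples = []
    for a in range(1, N // 3 + 1):            # a is the smallest entry, so a <= N/3
        for b in range(a, (N - a) // 2 + 1):  # b <= c forces b <= (N - a)/2
            c = N - a - b
            if a * a + b * b == c * c:
                triples.append((a, b, c))
    return triples

print(perimeter_triples(12))   # [(3, 4, 5)]
print(perimeter_triples(30))   # [(5, 12, 13)]
```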

Example 2: Given two positive integers $a$ and $b$, find the greatest common divisor of $a$ and $b$ (the largest positive integer that divides both $a$ and $b$) and the least common multiple of $a$ and $b$ (the smallest positive integer that both $a$ and $b$ divide).

You may be familiar with the name of Euclid of Alexandria (c. 325-265 BCE) from having studied what is often called Euclidean geometry in school. But perhaps you are unaware that it was the same Euclid who found one method of constructing Pythagorean triples and developed an algorithm, now usually called the Euclidean Algorithm, which finds the greatest common divisor of two positive integers and, from it, the least common multiple. As a bonus, the Euclidean algorithm can be used to determine when the linear equation $ax+by=c$, where $a$, $b$, and $c$ are positive integers, has an integer solution for the variables $x$ and $y$. (In this situation $x$ and $y$ can be zero or negative.) This type of problem is now known as a Diophantine equation, in honor of Diophantus (flourished 250 CE), another Greek mathematician of antiquity, who seems to have also lived in Alexandria in Egypt.
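Here is a minimal Python sketch of the Euclidean Algorithm applied to the numbers 80 and 145 from the introduction; the helper names are mine. The identity $\gcd(a,b)\cdot\mathrm{lcm}(a,b)=ab$ then gives the least common multiple, and an extended form of the same algorithm produces integers $x$ and $y$ with $ax+by=\gcd(a,b)$, which is why it settles the linear Diophantine question mentioned above.

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b)."""
    while b != 0:
        a, b = b, a % b
    return a

def lcm(a, b):
    """Use the identity gcd(a, b) * lcm(a, b) = a * b."""
    return a * b // gcd(a, b)

print(gcd(80, 145))   # 5
print(lcm(80, 145))   # 2320
```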

Alonzo Church and his mathematical milieu

One of the pioneers of computer science as an academic discipline was the mathematician Alonzo Church, who is usually classified as a logician within the areas that mathematicians call their specialties. (The American Mathematical Society has a classification system for published research in mathematics that is updated every ten years. The current version of this system has 63 broad categories, including number theory, logic, and mathematics education, and many subdivisions within these broad categories.) Church was born in Washington, D.C. Eventually he made his way to Princeton University in New Jersey for his undergraduate degree and did his graduate work in mathematics at Princeton as well.

Church wrote a thesis entitled Alternatives to Zermelo's Assumption. This thesis was carried out under the distinguished geometer Oswald Veblen, who in turn had studied mathematics under the direction of E.H. Moore at the University of Chicago. The title of Church's dissertation invokes the name Zermelo, referring to Ernst Zermelo (1871-1953), who received his doctorate in 1894 from the University of Berlin.

Photo of Ernst Zermelo, courtesy of Wikipedia.

Zermelo's name is forever associated with Zermelo-Fraenkel set theory—an axiomatization of set theory that was the subject of Church's dissertation. (Abraham Fraenkel (1891-1965) was born in Germany but later lived in Israel and became the first Dean of Mathematics at Hebrew University in Jerusalem.)

Photo of Abraham Fraenkel, courtesy of Wikipedia.

After finishing his dissertation, Church spent two years traveling with the support of a National Research Fellowship. He returned to Princeton in 1929. Many research threads during this period were following up on ideas and problems developed by David Hilbert (1862-1942) and Kurt Gödel (1906-1978). Hilbert raised many questions about systems of axioms (rules) and what they accomplished. A fundamental property of a system of axioms is whether or not it is consistent: might it be possible to take a rule system and show that both a statement A and the negation of A hold in that system? Meanwhile, mathematics was abuzz with Kurt Gödel's work showing that in any axiomatic system that met a certain threshold of complexity, there were statements that one could meaningfully write down but that could be neither proved nor disproved within the given axiom system using the laws of logic. Church was stimulated to understand the ramifications of the ideas of Hilbert and Gödel and pioneered investigations related to various implications of this work.

Photo of Kurt Gödel, courtesy of Wikipedia.

Photo of David Hilbert, courtesy of Wikipedia.

Remarkably, Church had 36 doctoral students, nearly all of whom received their degrees from Princeton over the period from 1930 to 1988. While many researchers have many doctoral students, the set of Church's doctoral students is remarkable in terms of how famous they became as researchers themselves, and sometimes—as in the case of John Kemeny (1926-1992), one of the inventors of the programming language BASIC—famous in a broader context. Most of Church's students would be described as logicians and some of them, like Church and his student Alan Turing, were innovators in and contributors to the emerging field of computer science. These students include:

  • Stephen Cole Kleene (1909-1994)
  • Alan Turing (1912-1954)
  • Leon Henkin (1921-2008)
  • Martin Davis (1928-2023)
  • Hartley Rogers, Jr. (1926-2015)
  • Michael O. Rabin (1931-present)
  • Dana Scott (1932-present)
  • Simon Kochen (1934-present)
  • Gerald Massey (1934-2024)
  • Joel W. Robbin
  • J. Barkley Rosser (1907-1989)
  • Raymond Smullyan (1919-2017)
  • John Kemeny (1926-1992)

Church eventually became interested in a wide variety of problems of interest to computer scientists. In particular he developed various ideas about what kinds of numbers could be computed by a computing machine.

Alan Turing

Alan Turing was born in England and displayed talent and interest in mathematics as a youngster. He did his undergraduate work at King's College, Cambridge, finishing his degree in 1934. However, he chose to continue his studies in the United States at Princeton, where in 1938 he completed a doctorate under the supervision of Alonzo Church with a thesis entitled Systems of Logic Based on Ordinals.

Photo of Alan Turing, courtesy of Wikipedia.

Work of Turing led to the notion of something that at first examination might seem to be a very primitive device for doing computation, today usually called a Turing machine. A Turing machine consists of a tape divided into cells; the machine can advance to a neighboring cell or move back one cell along the tape. The device can place a symbol $X$ into a cell or erase an $X$ left in a cell from a previous state of the machine. The machine also comes with a set of instructions that tell it, based on its current state and the symbol it sees, what to write, which way to move, and which state to enter next, allowing it to carry out a particular task. Turing also discussed the notion of a universal Turing machine: a machine that, given descriptions of other Turing machines, could carry out the tasks those machines carry out. A universal Turing machine could carry out any task that a modern sequentially structured computer with an internally stored set of instructions (or computer program) could carry out. Church and Turing used these ideas to formulate the following principle:

Church-Turing Principle (Thesis). All effective methods to compute values of a function defined on the positive integers are equivalent.

In particular, if a function can be computed effectively, it can be computed by a Turing machine.
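To make the definition concrete, here is a minimal Python sketch of a Turing machine simulator; the representation of the rules as a dictionary is my own choice, not Turing's original formalism. The sample machine computes the unary successor: it scans right past a block of X's and appends one more.

```python
def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (left) or +1 (right).  The machine halts when no
    rule applies to the current (state, symbol) pair.
    """
    cells = dict(enumerate(tape))   # sparse tape: cell index -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, "_")        # unvisited cells are blank
        if (state, symbol) not in rules:
            break                            # no rule applies: halt
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

# Unary successor: move right over the X's, write one more X, then halt.
rules = {
    ("start", "X"): ("X", +1, "start"),   # keep scanning right over X's
    ("start", "_"): ("X", +1, "done"),    # first blank: write an X and stop
}
print(run_turing_machine(rules, "XXX"))   # XXXX
```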

The path that Turing might have taken as a scholar with a doctorate in mathematics was altered by political events when Britain was drawn into World War II. Turing became involved in an elaborate effort to use information from German and Italian communications to gain military advantage. Monitoring communications traffic can sometimes allow one to infer tactical or strategic information about an enemy's plans, and one might even actively intercept communications to get specific information. As a defense, military messages are often encrypted (their content hidden by using codes or ciphers). The British assembled an elaborate team of individuals, many with training in linguistics and mathematics, to break the Axis Powers' secret codes. This team was housed at Bletchley Park, a location about 50 miles from London.

Bletchley Park, in England, courtesy of Wikipedia.

Not surprisingly, the activities at Bletchley Park were carried out at a high level of secrecy. One of the most important achievements there was reconstructing how the Enigma, a device used to encrypt messages so that they would be secure, actually worked. (For more about the mathematics of the Enigma, and analyses of it made by Polish mathematicians, see a series of Feature Columns from 2009, June 2013, and December 2013.)

Enigma machine, courtesy of Wikipedia.

The Allied cause during World War II was advanced when, due to fortunate circumstances, a physical Enigma machine fell into the hands of the Allies. Though different groups of Axis military forces used different variants of the Enigma machine, Turing and his team constructed both conceptual and physical approaches to the Enigma machine and its inner workings. When the exact text of a German military message was intercepted, a decrypted version of the message could often be obtained quickly. Sometimes when this was done, the British could alert their troops in the field and their cities at home to mitigate the harm planned by Axis military actions or bombings. In fact there is a story, whose accuracy remains unclear, that a dilemma arose over whether a warning should be given to a city targeted by a planned German bombing raid. If preparations had been made to mitigate the attack, the Germans might have realized that their encryption was being broken. The moral cost of withholding a warning from the target of a bombing had to be weighed against the future benefit of the British being able to keep breaking German encryption without the Germans knowing it.

What went on at Bletchley Park only became widely known after World War II, and even today (2024) some information about the wartime efforts in cryptography and cryptology, and about the people who were involved and how, has not been made fully public. However, it is certainly known that Alan Turing played a critical role in breaking ciphers that the Germans thought were unbreakable.

Alan Turing was a gay man. During much of the twentieth century, sexual contact between men was illegal in Great Britain. (Similar laws prevailed in parts of the United States until the landmark Lawrence v. Texas Supreme Court case in 2003.) In 1952, Turing was charged with a crime due to his relationship with another man. He pleaded guilty and, rather than serve time in prison, was given probation in exchange for participating in a yearlong course of hormone therapy. This therapy resulted in physical changes that depressed Turing. Turing died in 1954. The official cause of death was suicide, though not all biographers agree with this assessment. Many years later, the British government issued a formal pardon for Turing's conviction, describing the sentence as "unjust and discriminatory".

During his brief life, Alan Turing made dramatic contributions to mathematics and to a field, computer science, that barely existed at the time that he did his research work.

The Turing test

Within the broad spectrum of issues that computer science investigates is the area that has come to be called artificial intelligence (AI). Broadly speaking, AI is concerned with getting machines to do things that, when done by humans, require "intellectual skills." The earliest calculating devices allowed the person using the device to carry out complex arithmetic or to evaluate mathematical expressions involving logarithms or trigonometric functions. While determining the value of $7^{20}$ might be possible by hand, using a calculating device is easier and perhaps more reliable. The games of checkers, chess, and Go are also viewed as environments where one succeeds not through luck but through intelligence. However, when calculating devices or computers perform complex tasks, they do so because they were programmed by humans; they cannot think or solve problems of their own volition.
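As an illustration of why a machine helps even with this simple task, here is a minimal Python sketch that computes $7^{20}$ by repeated squaring, a standard technique (my choice of illustration, not one discussed in the text above) that uses far fewer multiplications than multiplying by 7 nineteen times:

```python
def power(base, exponent):
    """Compute base**exponent by repeated squaring.

    Works through the binary expansion of the exponent, so it needs
    only about log2(exponent) squarings instead of exponent - 1
    multiplications.
    """
    result = 1
    while exponent > 0:
        if exponent % 2 == 1:   # odd exponent: peel off one factor of base
            result *= base
        base *= base            # square the base
        exponent //= 2
    return result

print(power(7, 20))   # 79792266297612001
```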

In 1950 Alan Turing proposed a framework for judging whether a computer exhibited intelligence. Though there are disputes about exactly what Turing intended and about the rules of the interaction he had in mind, he developed what is now often called the Turing Test. In essence, a person $X$ queries the occupants of two rooms, one of which houses a computer and the other a human. The goal of $X$ is to pose a series of questions to the occupants so as to determine which room holds the computer and which holds the human. If $X$ is unable, with the questions posed, to tell the rooms apart, then the computer is deemed "intelligent," or at the very least is doing an impressive job of mimicking a person. A variety of experiments related to the Turing test have been carried out, and there are now computer programs that can systematically convince many humans that they are conversing with another human.

Systems are now available that can create essays, videos and photos that, while "fake" in the sense that they are computer generated, are extremely difficult to distinguish from "real" essays, photos and videos. This development has created a variety of political and moral issues, and mathematicians and computer scientists are working to deal with the consequences of this reality.

Further reading

  • Hodges, A., Alan Turing: The Enigma (1983)
  • Olinick, M., Simply Turing (2020)
  • Dyson, George, Turing's Cathedral (2012)
  • Goldstine, Herman, The Computer from Pascal to von Neumann (1972)
  • Kidder, Tracy, The Soul of a New Machine (1981)
  • McCartney, Scott, Eniac: The Triumphs and Tragedies of the World's First Computer (1999)
