Philosophy of Religion Topic: Concept of a Concept part II

Article #283
Subject: Concept of a Concept part II
Author: Andrew W. Harrell
Posted: 5/21/2016 07:11:12 PM

Concept of a Concept

Part II

V

A short history of developments in the concepts of mathematical logic

A Concept is a data object (which is the result of a functional computation)

After the writings of Immanuel Kant in the late eighteenth century it took some time to translate his
metaphysical and epistemological insights into scientific and theosophical application. In the 1840s and
1850s the English mathematician George Boole invented a way for algebraists to do logic. And the
German mathematical philosopher Gottlob Frege wrote his seminal 1879 paper on a formal notation for
logic, the Begriffsschrift ("concept writing"). For them mathematics was the study of what was invariant
logically and conceptually under change of notation. Was it about the real world? Yes, in some sense,
but as Kant had pointed out it is impossible for us to know this "real world" completely while we are
doing mathematics about it. So, in another sense it was also true that the mathematics they were doing
was "not about the real world," but about our mental constructs of it.
So, what is the new way of thinking about concepts as functions and computers? For more than a
thousand years Aristotle's logical and definitional schemes dominated the fields of study in philosophy,
science, and mathematics. According to the traditional logic and philosophy of Aristotle, formal
reasoning followed from four possible forms of judgment (the universal affirmative judgment, all of A is
B, which we will denote as an A; the particular affirmative judgment, some of A is B, which we will
denote as an I; the universal negative judgment, none of A is B, which we will denote as an E; and the
particular negative judgment, some of A is not B, which we will denote as an O). Along with these four
types of judgment there were four figures of syllogistic inference. If we break each syllogism into a
subject S, predicate P, and middle term M, the resulting four figures of syllogistic inference can be
denoted:

MP   PM   MP   PM
SM   SM   MS   MS
--   --   --   --
SP   SP   SP   SP
Based on purely combinatorial grounds this gives 256 different kinds of syllogisms. If we let the
unknown X mean "is S," the unknown Y mean "is M," and the unknown Z mean "is P," and express the
above syllogistic forms in the predicate Boolean calculus of the English mathematician George Boole,
the premises in the first line above can be written:
|'Y v Z| ; |'Y v 'Z| ; |'Z v Y| ; |'Z v 'Y| , where 'Y means not Y, X v Y means X or Y, and |X|
means the truth value of the variable X. E.g., |'Y v Z| means the truth value of "not Y or Z" (which,
using the normal definition of implication, is the same as the truth value of "Y implies Z").
The bottom two lines can be written:

| ‘Y v X| ; |’Y v ‘X| ; |’X v Y| ; |’X v ‘Y|

And
| ‘X v Y| ; |’X v Z| .

Using this symbolic notation it is possible to show that the truth or falsity of all the possible Aristotelian
syllogisms may be completely represented and tested in this Boolean functional algebraic formalism.
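Such a test can be sketched in a few lines of Python. (A hedge: the function names below are mine, and reading "All A is B" purely truth-functionally as |'A v B| sets aside the existential import Aristotle assumed, so some traditionally valid forms fail this brute-force test.)

```python
from itertools import product

def implies(a, b):
    # "All A is B," read truth-functionally as |'A v B|: not A, or B
    return (not a) or b

def valid(premise1, premise2, conclusion):
    # A form is valid if no assignment of truth values to X ("is S"),
    # Y ("is M"), Z ("is P") makes both premises true and the
    # conclusion false.
    return all(conclusion(x, y, z)
               for x, y, z in product([True, False], repeat=3)
               if premise1(x, y, z) and premise2(x, y, z))

# Barbara (figure 1, AAA): All M is P, All S is M, therefore All S is P.
barbara = valid(lambda x, y, z: implies(y, z),
                lambda x, y, z: implies(x, y),
                lambda x, y, z: implies(x, z))
```

Running the same check with a wrong conclusion (say, All P is S) finds a counterexample assignment and reports the form invalid.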
But now the question arises: does the list of Aristotelian possibilities exhaust all the Boolean ones?
Take the sentence "If there is a son, then there is a father" as a counterexample to asserting this in all
situations. Let X stand for "is a son" and Y for "is a father." Translating the sentence into our notation we
have:
|'X v Y|
But this expression can have many different truth values depending on what X and Y stand for. And
there is no way, using this notation alone, to capture the relationship between what X and Y stand for in
the one-argument Boolean algebra formalism. What is needed is a formal device which expresses the
meaning of the statements "is a son" and "is a father" adequately: what is called a logical predicate
with two arguments. We would then have predicates "is_son(X,Y)" and "is_a_father(X,Y)" with the
logical axiom connecting and defining the two predicates: "is_son(X,Y)" if and only if "is_a_father(Y,X)".
"X if and only if Y" is shorthand notation for the assertion that X implies Y and Y implies X. A relation is
not a uniquely determined mapping from a domain to a range like a function; it is just a set of ordered
pairs (X,Y) where X and Y run through different sets. Given a value for X there may be many different Y's
for which the relation is defined. But given values for both X and Y, the relation has a value of true or
false for that ordered pair.
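This distinction can be sketched directly in Python: a relation stored as a bare set of ordered pairs, with the connecting axiom applied mechanically. (The family names in the fact set are mine, for illustration only.)

```python
# The father relation as a set of ordered pairs (X, Y),
# read "X is the father of Y". Example facts are illustrative.
is_a_father = {("Abraham", "Isaac"), ("Abraham", "Ishmael"),
               ("Isaac", "Jacob")}

def is_son(x, y):
    # The connecting axiom: is_son(X, Y) if and only if is_a_father(Y, X).
    return (y, x) in is_a_father

def sons_of(y):
    # Given one value Y, the relation may hold for many different X's.
    return {x for (f, x) in is_a_father if f == y}
```

Note that `sons_of` can return several values for one input, which is exactly why a relation is not a function.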

Gottlob Frege followed Pascal's and Immanuel Kant's suggestions and made the first attempt to try and
give us a precise logical definition of what had already been used as a function by Rene Descartes.
He defined a function as a mapping [correspondence] between metaphysical objects in one realm and
another that could be shown to be well-defined [associating some object in the function's range to each
object in its domain] and one-to-one [not associating more than one range object to each]. Once we
have defined what a function is conceptually, we can study functions abstractly. What we later came to
call computer programs are composed of algorithms [ordered sets of rules for the operation of
functions on spaces] + data structures [ways in which we store information symbolically in the
program's memory; for more information on what I am talking about here see the influential 1976 book
on computer programming with this title by Niklaus Wirth]. Concepts, as we explained earlier, are the
form in which we humans choose to transmit and work with our self-developed and God-given
knowledge. So, when we say that we can consider a concept as a computer program, we are claiming
that this program is not just a means or tool that we use to answer questions or search our memories,
but has value in itself and is a form of our knowledge. So, with this way of looking at things [and our
knowledge of them] I think that we are somewhat closer to considering what we are trying to
understand about how we actually understand sense information, our thoughts about this sense
information, and when we can actually make warranted beliefs or judgements about whatever truth is in
them. Some of these judgements will be ethical and subjective, some objective and scientific, some
philosophical and mathematical. But in all three cases we want to try and understand the functional
processes in our minds and brains [which may not necessarily be operating on/with the same
data… realities or substances] which cause us to reach these conclusions [judgements about
knowledge and the creation of new knowledge for us].

Third definition of a concept----

A concept is a data structure. That is, it is a predefined set of object types. These types can be
frames with slots [classes], words, numbers, lists, streams, or variables. In certain situations they can
be recursively defined. But the final tree structure is usually limited to have only a finite number of
branches. The information it contains is the values or attributes of the objects that the data structure
describes.
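A minimal Python sketch of this third definition (all class and slot names below are invented for illustration): the class is the predefined object type, its fields are the frame's slots, and an instance holds the actual attribute values.

```python
from dataclasses import dataclass, field

# A "frame with slots" rendered as a Python class. The class declares
# the slots; an instance carries the actual values.
@dataclass
class Car:
    engine_type: str
    top_speed: int
    features: list = field(default_factory=list)  # a list-valued slot

my_car = Car(engine_type="V8", top_speed=180)
```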

Some Definitions of Terminology Related to These Ideas

Attribute -- Defines the qualities or values contained in a class and the type of information that makes
up a class. For example, the class car can have the attributes "type of engine" and "top speed".

Attribute value -- An actual value (such as a number) assigned to an attribute, possibly accompanied
by a confidence factor representing the degree of certainty with which the value is known.

Class -- Defines the structure (in terms of its attributes) and behavior (in terms of its associated
methods and procedures) of an object. When it becomes an instance, it then holds the actual data
values of a particular realization of this type of object in the knowledge base. For example: a class
called human beings might have attributes related to the parts that differentiate our physical beings and
categories such as those related to our mental and spiritual capacities. Some of the associated
methods and procedures of this class could be thinking, talking, walking. It can be considered as a
subclass of another class such as the class of living beings. The author and the reader are both specific
instances of a human being object.


Forward-Chaining -- Forward-chaining reasoning is an inferencing strategy in which the questions are
structured from the specific to the general. That is, it starts with user-supplied or known facts or data
and concludes new facts about the situation based on the information found in the knowledge base.
This process will continue until no further conclusions can be reached from the user-supplied or initial
data (using the rules and methods in the knowledge base). (See the previous section for a more
complete explanation and an example.)
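The forward-chaining strategy can be sketched in a few lines of Python (the rules and fact names below are invented for illustration, not drawn from any particular expert system):

```python
# Rules as (antecedents, consequent) pairs.
rules = [
    ({"has_wheels", "has_engine"}, "is_vehicle"),
    ({"is_vehicle", "carries_people"}, "is_car"),
]

def forward_chain(facts, rules):
    # Fire any rule whose antecedents are all known, add its consequent,
    # and repeat until no new facts can be concluded.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts
```

Starting from the user-supplied facts, the loop concludes "is_vehicle" and then "is_car", and stops when nothing new can be derived.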

Instance or Instantiation -- Specific occurrence of an object or a predicate. An object consists of its
class structure, which defines its attributes and behavior, and its instances, which hold the actual values
of the object. Thus objects are instantiated (or defined) by a process of forward-chained reasoning in
which the attribute slots are given values. An instance of the class human beings mentioned above
would refer to an individual person, such as the reader of this report. Predicates are instantiated or
defined by a process of backward-chained reasoning in which the arguments (some of which may be
recursively defined) are unified with previous facts or postulates.

List -- An ordered set of objects tied together one to another. Its length is not predefined, but it does
have a first and last element.

Number -- An integer or real number.

Object -- General term for a programming entity that has a record type data structure along with
attribute values and procedures or methods that enable it to represent something concrete or abstract.
It can be contrasted with other programming entities such as facts, rules, procedures, or methods. An
object's structure is defined by its class and attribute definitions. A class declaration is a data template
involved in representing knowledge which defines the structure of an object. For example, in the class
"human being" mentioned above some of the slots might be height and weight.



Recursion -- A process by which a data type, predicate, or function is defined in terms of itself. This
self-reference allows the function or predicate to be computed in an orderly manner.

Stream -- An ordered set of objects tied together one to another. It has a first, but not necessarily a last
element.

Variable -- A name that represents the value of an unknown object.

Word -- A name.




OBJECT-ORIENTED ALGORITHM TO GROUP OR CLUSTER SETS
OF EXAMPLE DATA IN CATEGORIES

Start off with an initial set of clusters which
partition a set of sample data into groups.

For each example in a series of new data samples:

a) Compute the mean or centroid, or some other mapping
that quantizes or compresses each group of data samples into
a small number of values.

b) Place each new example in the cluster which most
closely matches (resembles or is contiguous to) the initial
categories.

Recompute some measure of overall effectiveness of the
classification, such as:

Sum of squares of residual difference from each sample to
the cluster’s quantization value.

Iterate on all the new examples until this measure cannot
be improved further.
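The loop above is essentially the classical k-means clustering procedure. A minimal one-dimensional sketch in Python (the function and variable names are mine, not a standard library's):

```python
# Assign each sample to the nearest centroid (step b), recompute each
# cluster's mean (step a), and iterate until the centroids stop
# changing, i.e. the sum-of-squares measure can no longer be improved.
def kmeans_1d(samples, centroids, max_iters=100):
    clusters = {}
    for _ in range(max_iters):
        # place each sample in the cluster it most closely matches
        clusters = {i: [] for i in range(len(centroids))}
        for s in samples:
            nearest = min(range(len(centroids)),
                          key=lambda i: (s - centroids[i]) ** 2)
            clusters[nearest].append(s)
        # recompute the mean (centroid) of each non-empty cluster
        new_centroids = [sum(c) / len(c) if c else centroids[i]
                         for i, c in clusters.items()]
        if new_centroids == centroids:
            break
        centroids = new_centroids
    return centroids, clusters
```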

Quantization --- The process by which a set of data is
partitioned into parts or segments. This process proceeds from
the bottom up, using distance measures to separate the whole
space into disjoint parts. Each part has a number or set of
values associated with it. So, when a new object is examined
it may be identified by the value of certain attributes.


Aristotle considered sense information as the primary
source in creating our knowledge. He understood that this process
of quantization must occur before the algorithm of rule-based
conceptual identification (which was explained above) can occur.

It is an error to disregard this type of knowledge by
saying that things are quantized by their relation to causes.

The philosopher David Hume was the first to understand that,
because this algorithm works by resemblance and contiguity, it is an entirely different
method than the earlier backward chaining algorithm.

But also, in the midst of this learning process, the opposite mistake can occur: things are not
related to causes and definitions (essences) just because the forward chaining algorithm above says
they are. But the backward chaining, rule-oriented algorithm to be explained below can be used
to learn how to separate from each other the parts which have already been
quantized.

The philosopher Immanuel Kant has explained
to us that the way we as humans choose, in the above manner,
to group together categories of thought can itself influence the way we know things. Thus knowledge,
although it is something which uses definitions (forms or essences) separate from us, is also something
which depends to some extent on structures or clusters of categories of thought that are inside of us.
And so, the sentence:

“We hold these truths to be self-evident,
that all men are created equal”

can be interpreted to mean that we ourselves contain what Aristotle called the “potency” in our minds
for this statement to be true.


VI

CONCEPTS AS RELATIONS.
HOW CONCEPTS CHANGE,
WHILE REMAINING THE SAME,
INSIDE OF THE STREAM OF CONSCIOUSNESS
IN OUR MINDS


This way of looking at concepts assumes that
objects are appearing to us and the intellectual faculties in
our minds are able to identify and unify how the objects fit
together as time progresses.



METHODS AS CONCEPTS ----


While rules are used for backward-directed, goal-oriented reasoning, and objects and recursively
defined data types are appropriate for building up forward-directed production systems, models are
appropriate for procedure-oriented, case-based reasoning.

A model deals with some topic, a pattern of behavior, a procedure for accomplishing a task, an overall
type of reality (World view).
A paradigm or case is:

1) a way of looking at a body of facts

2) an example, a particularly good example

3) a pattern, an all encompassing pattern.

One can mistake a paradigm for a theory, in the same way one can mistake a series of examples for a
definition. A good example (a paradigm) can serve as a model for the interpretation of a body of facts.
However, when it becomes a model it becomes capable of being displaced. Remaining a paradigm, an
example of a way of looking at things, it stands for what it is.


A conceptual model (as opposed to physical one) is something that exists in the mind. It is envisaged
and/or specified without actually being all that it represents. It may be required to simulate what one is
interested in, i.e. a topic, pattern of behavior, tasks, or World view. The better it simulates reality the
better a model it is.

4th Definition of a Concept--- A concept is a model (involving the essential parts of a series of cases or
examples) along with a program to learn, retrieve, and identify the concept (knowledge). The program
has a data structure part + an algorithm (interpretive procedure) part. The algorithm may consist of a
set of rules, as in a classification-type expert system. It may be all the statements which are derivable
from a set of axioms involving predicates with variable terms and ground instances of facts. Or, it may
be a pattern identification routine, such as in a neural network.


--- A model can be of a set of logical relationships. In this case it is an interpretation of a set of
sentences which it satisfies. A concept is a logical interconnection between facts which can be
interpreted as a model of thought [see next section].

--- A model can be of the way something works. It can use data structures to construct a program to
simulate what something does.
It can be used to test whether facts fit into this viewpoint. And, in this sense it has a type of semantic
truth (A. Tarski) associated with it. It has a mapping (mapping is used here in the mathematical sense)
of truth tables which enables us to test whether any statement about the World can be true in some
interpretation of the model.

--- A model can be a way of looking at the World [a physical theory, ethical theory, philosophic theory,
religious theory]. In this case there is a interpretation [mapping] from the objects in the World to a set
of facts, constants, variables. There is a translation of physical laws, ethical, philosophic, religious
beliefs and postulates into rules connecting those facts, constants and variables.


IMPORTANT NOTE: The more we want to talk about knowledge related to the World and the
less about knowledge related to ethics, philosophy, theology, the more we need to introduce numbers
and their language mathematics into the propositions we write. For example, once we have numeric
data types, we can go from simple verbal propositions (qualitative judgements) to statements involving
numeric values (quantitative judgements). We can then make a philosophic classification-type expert
system, of the type given in the discussion of the second definition of a concept, into a quantitative
case-based reasoning tool.


--- In a classification type expert system, the model involves a data structure as in the 3rd definition of
a concept. It also involves some means of retrieving the information, along with a way of creating the
rules as in the 2nd definition of a concept.

How do we create the knowledge tree that we use in such a quantitative case-based reasoning expert
system? For the philosophic expert system we asked a series of questions from the general to the
specific about something we believe that we already have in the mind. Now, for this, we need a series of
examples or cases in order to develop cutoff values of object attributes in order to branch into the
knowledge tree. Again the questions go from the general to the specific. But how do we know which
attributes are general? Answer: we can construct statistical summaries and tables that analyze the
examples in the data to determine which attributes are most associated with the particular results we
are interested in. These attributes are then said to be the most general, because they determine the
first questions we need to ask in constructing a classification tree.

ALGORITHM TO CLASSIFY OR IDENTIFY AN OBJECT BY THE
EXPERIENCE WE HAVE OF IT

1) Construct a neural net consisting of a layer of input nodes or neurons, a hidden layer of
interconnecting nodes, an output node (on-off) or neuron, an activation function for the output node,
and a feedback procedure to adjust the probabilities of firing for the interconnecting nodes.

2) For each sample in a set of data which both belongs and doesn’t belong to a given class of
objects:

a) compute the value of the output (on-off) node for that sample's input value and
adjust (feed back) the values of the connecting weights associated with the hidden layer nodes so that
the output is true or false according to whether the data does or does not belong to the given class.

NOTE: This learning procedure does not require that the sample space be partitioned into
(object-oriented) categories first. Nor does it require that there be a top-down identification tree of the
twenty-questions type. However, this procedure can be used in combination with one or another of the
other procedures.
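Steps 1) and 2) above can be sketched with a single artificial neuron: an on-off (step) output and the classical perceptron feedback rule. (A simplification, admitted up front: no hidden layer, and the class to be learned here is logical AND, chosen only so the example stays small.)

```python
# A toy version of the procedure above: one neuron with an on-off
# activation, trained by a feedback rule that adjusts the connecting
# weights whenever the output disagrees with the class label.
def train_perceptron(samples, epochs=20, lr=0.1):
    # samples: list of ((x1, x2), label), label 1 = in the class, 0 = not
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            output = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            error = label - output          # feedback only on a mistake
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

def predict(w, b, x1, x2):
    # the on-off output node: fire (1) or not (0)
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
```

Trained on samples that both belong and do not belong to the class, the weights settle so that the output node fires only for members of the class.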

As will be explained in the next section the order of the steps in this algorithmic approach may be
used not only for neural network type classification programs, but also for logic programming expert
system classification systems. These programs ask a series of questions of an outside person running
the program (human expert) and, based on his or her answers along with the set of rules built into the
system, output a set of conclusions as each rule in the ruleset fires. According to a theorem proved by
Jacques Herbrand, the logical system represented by the statements the rules make can, under most
circumstances (where there is a finite set of rules and the rule-base logic is monotonic), be assumed to
consist of a certain type of simplified statements called clauses.

VII

A CONCEPT AS A LOGICAL RELATIONSHIP....



Given that we can compute with data structures and algorithms to get a conceptual model of a
functional and recursive program, how can the extent or scope of the form of this knowledge be
extended? Many logical problems can already be solved with the first four types of knowledge
representation that have been discussed:

1) The solution to logical problems such as: Jim is the Grandfather of Sue, Kathy is the mother of
Jim. What is the relationship of Sue to Kathy?

2) How do we find the river that flows through Missouri, Arkansas, Mississippi, and Louisiana?

These questions can be answered using the ideas already mentioned.
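Questions of this kind become mechanical once the facts are stored as relations. A Python sketch of the first puzzle (the fact sets and the derived predicate name are mine, for illustration): Kathy is the mother of Jim, and Jim is the grandfather of Sue, so Kathy is Sue's great-grandmother.

```python
# The facts of the puzzle, stored as relations (sets of ordered pairs).
mother_of = {("Kathy", "Jim")}
grandfather_of = {("Jim", "Sue")}

def great_grandmother_of(x, y):
    # X is a great-grandmother of Y if X is the mother of some Z
    # and Z is a grandfather of Y.
    return any((x, z) in mother_of and (z, y) in grandfather_of
               for z in {child for (_, child) in mother_of})
```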
But, there is more to the language of what is called first and second order mathematical logic:

1) What do we mean when we say that a statement is false, or that the set of its solutions [variable
identifications that satisfy it] is empty? How can we determine if this is the case?

2) What do we mean when we say that a statement is always true no matter what values the
variables in its expression take?

3) What is a statement in predicative logic? How is it different from a mathematical function?

4) How does this difference affect the way we compute solutions that satisfy a set of statements?
How do we keep track of the partial solutions (store the data) so that we can explain how the conclusion
was reached?

5) How can we best compute the set of all ways of satisfying a predicative statement?

These problems require that we introduce quantifiers (there exists, for all) into statements. We also
have to better explain how the data sets in variable substitutions match into themselves and other
different pattern identifications.

5th Definition of a Concept ----- A concept is a logical relationship involving a predicative
statement (a subset of the n-fold Cartesian product of the domain values and variables, instead of just
a functional mapping). This logical relationship may also involve the question of the satisfaction of the
concept (truth in terms of a specific knowledge representation). It may also involve the notion of a set
of variable identifications in some model [data + algorithm] . And, it may also involve the notion of how
a method for determining truth searches through the space of variable identifications inside of a pre-
determined set of program search rules [logic + control] as a part of determining what the algorithm
used will be.

From an implementation standpoint, a recursive function such as those defined in the previous
section means a function along with a stack [an array of data in memory] to hold the variable
identifications, return point, and so forth. A predicate is equivalent to a recursive function along with its
stack, plus another memory area to hold the goal list, partial solution, partial variable identifications,
binding array, backtrack points, and so forth.
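The difference can be sketched in Python rather than a logic-programming language (the parent facts below are invented): a function call returns one value, but a predicate query may succeed in several ways, so the search must branch on every matching fact and accumulate the partial solutions.

```python
# Illustrative parent facts.
parent_of = [("Kathy", "Jim"), ("Jim", "Bob"), ("Bob", "Sue")]

def ancestors(y):
    # ancestor(X, Y) :- parent(X, Y).
    # ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
    found = set()
    for (p, c) in parent_of:
        if c == y:                 # this fact unifies with the goal
            found.add(p)
            found |= ancestors(p)  # recurse on the new subgoal
    return found
```

Asking for the ancestors of Sue succeeds three times, once for each matching chain of facts, where a function in the strict sense would return only a single value.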

From a hardware requirement standpoint, the knowledge representation required for this definition
of a concept is already covered by the idea of a stored-memory programmable computer [deterministic
Turing machine] explained in the 4th definition of a concept. However, if we believe that ideas and
the way we organize these ideas in our thoughts have a reality in themselves, then these new ways of
representing them are something different. If this is so, then there is such a thing as a concept of a
concept.

It is useful at this point to try and explain some of the details of these questions which may not have
been clearly understood over the last several thousand years. Fairly recently it has become clear that
objects and predicates are defined using quite different data structures and computational procedures:


Some more definitions and terminology that form examples of the observations concerning the
second definition of a concept and its limits


Warning -- there are two parts to every definition;

-- a rule to identify the object;

-- the assertion that the rule is adequate.

The second part is a "hidden assumption" that all definitions contain. We agree beforehand
that we know what we are talking about.

-- there is no guarantee that the [rules] [marks] won't need to be changed later.

-- the terminology "if and only if" exists only in the imagination.

When a computer goal-search program is designed using a set of rules, we can try to satisfy the
rules using either a bottom-up, forward chaining strategy, a top-down, backward chaining strategy, or a
combination of both bottom-up and top-down goal search using both forward and backward chaining. If,
in the rules that set up the definition of terms, a negative possibility is not allowed in one of the clauses,
then we won't ever be able to know whether the search was not satisfied, and hence the "only if" part of
the definition of the term is left undetermined. When the rule-based classification system asks its series
of questions and dynamically enters information into its object-oriented database as a result of the
answers satisfying the conditions and hypotheses of the rules, it may reach the end of the rule set
before all the questions that need to be asked have been (see the paper at the US Army Research
Office's 1994 Conference on Computing by Harrell, which has an example of how this happens in a
41-rule expert system to classify river bar creation and stream/river-bed erosion). This happens
because the logical system is what is called non-monotonic (that is, what logical assertions and
theorems are proved in the system may depend on the order in which the data and rules are
instantiated inside of its logical predicates). In this case the rule set needs to be sorted in terms of the
order in which the conclusion nodes and predecessor nodes are entered in the system's knowledge
tree. The algorithm below does this:

ALGORITHM TO TOPOLOGICALLY SORT RULES IN AN EXPERT SYSTEM

Start) For the whole set of conclusion nodes in the rules:
a. If every conclusion node has a predecessor, then stop. The rule-based system has a cycle
and is infeasible (that is, a partial order cannot be defined on it).
b. Pick a node V which has no predecessor.
c. Place V on a list of ordered nodes.
i. When the node's conclusion is assessed, if a terminal goal node is reached, print out the list of
rules used on the way to reach that goal/conclusion.
ii. Delete all edges leading out from V to other nodes in the knowledge tree.
d. Go to the start.
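The steps above are essentially Kahn's topological-sort algorithm. A Python sketch (the rule names in the example are invented; edges run from a rule's predecessor node to its conclusion node):

```python
from collections import defaultdict

def topo_sort(edges):
    preds = defaultdict(set)
    nodes = set()
    for a, b in edges:
        nodes |= {a, b}
        preds[b].add(a)
    ordered = []
    while nodes:
        # b. pick a node V with no remaining predecessor
        free = [v for v in nodes if not (preds[v] & nodes)]
        if not free:
            return None  # a. every node has a predecessor: a cycle
        v = min(free)    # deterministic choice among the free nodes
        ordered.append(v)   # c. place V on the list of ordered nodes
        nodes.remove(v)     # ii. removing V deletes its outgoing edges
    return ordered
```

A cyclic rule base (one where some rule's conclusion eventually feeds back into its own antecedents) is detected at step a and reported as infeasible.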


A short list of some basic logic and knowledge oriented terminology is listed below :


Backward-Chaining -- An inferencing strategy that is structured from the general to the specific. That
is, it starts with a desired goal or objective and proceeds backwards along a series of deductive
reasonings while it attempts to collect the hypotheses required to be able to conclude the goal. This
process continues until the goal is reached and it then displays its conclusion. (See following sections
for a more complete explanation and an example.)
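Backward chaining can be sketched in a few lines of Python (the rules below are invented for illustration): start from the goal, find a rule whose consequent matches it, and recursively try to establish each antecedent until only known facts remain.

```python
# Rules as (antecedents, consequent) pairs; contents are illustrative.
rules_bc = [
    ({"has_wheels", "has_engine"}, "is_vehicle"),
    ({"is_vehicle", "carries_people"}, "is_car"),
]

def prove(goal, facts, rules):
    # A goal succeeds if it is a known fact, or if some rule concludes
    # it and all of that rule's antecedents can themselves be proved.
    if goal in facts:
        return True
    for antecedents, consequent in rules:
        if consequent == goal and all(prove(a, facts, rules)
                                      for a in antecedents):
            return True
    return False
```

This simple prover assumes the rule base is acyclic; a production system would also keep the goal list and backtrack points mentioned earlier so it can explain how the conclusion was reached.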

Algorithm -- A procedure that conducts a calculation in an ordered manner for the purpose of solving a
problem.

Antecedent -- The IF part of a conditional statement.

Clause – A formula, to be included in a conditional statement, which contains a goal. Clauses do not
contain the terms which are displayed as functional or predicate arguments in the more standard way of
writing logical statements. If the clauses in the consequents are restricted to always being positive, the
clause is called a Horn clause.

Consequent -- The THEN part of a conditional statement.

Continuous function -- Roughly speaking, a function that has no jumps or breaks in its graph.

Differentiable function -- Roughly speaking, a function that has one and only one tangent line to its
curve (graph) at each of the points on it.

Domain -- A set of objects which form the elements from which a function maps.

Expert System -- A computer program that represents and uses expert human knowledge to attain high
levels of performance in a problem area. An expert system has two basic components: a knowledge
base which contains the information (facts, rules, and methods) found in the problem area being
represented, and an inference engine or mechanism that make use of the knowledge base (by
scheduling and interpreting the facts, rules, and methods) to make conclusions and decisions that solve
problems that would normally take a human expert more effort.

Fact -- A collection of logical relations between objects.


Forward-Chaining -- Forward-chaining reasoning is an inferencing strategy in which the questions are
structured from the specific to the general. That is, it starts with user-supplied or known facts or data
and concludes new facts about the situation based on the information found in the knowledge base.
This process will continue until no further conclusions can be reached from the user-supplied or initial
data (using the rules and methods in the knowledge base). (See the previous section for a more
complete explanation and an example.)

Function -- A mathematical mapping from one set to another which associates at most one value in the
range to each value in its domain.

Genetic Algorithm -- A mathematical procedure designed to provide computational searches through
the combinatorial possibilities that might lead to the goal of a program. Data formats are created that
represent the intermediate values of attributes involved in the search. Then some of the values are
mutated at randomly selected points in the data format in order to see whether the goal can be
reached. This approach normally works better than trying to represent intermediate attributes as
differentiable functions and searching in the direction of the tangents to the function curves. The
reason is that partial solutions have less tendency to get caught in valleys or depressions.

Goal --- A top-level consequent of the rules in the knowledge base toward which Backward-Chaining
may be directed. (It is a hypothesis that the program will try to determine if some group of rules can be
instantiated together to satisfy)

Graph of a function -- The set of ordered couples or pairs (x and y coordinates) of the values in the
domain and range of the function, displayed or plotted in a two dimensional representation.


Knowledge Base -- The sum total of all the facts and rules through which inferences, conclusions, and
goals may be reached. This may change as new facts and rules are added or subtracted from the
overall system.

Knowledge Tree --- A graph showing the logic and data flow connections between rules and facts in the
knowledge base. A knowledge tree presents a graphical representation of the complete structure of the
knowledge base.

Mapping -- A set of ordered couples of objects. Thus, {(1,2),(2,3),(3,4)} is a mapping from the integers
to the integers.

Range -- A set of objects which form the elements into which a function maps.



List -- An ordered set of objects tied together one to another. Its length is not predefined, but it does
have a first and last element.


Method --- A procedure stored in an object's class structure that can determine an attribute's value
when it is needed in the program, referenced in its class, or required to execute a series of procedures
because another value in the program changes. "When needed methods" are executed during
backward chaining to determine an attribute's value.

Node --- A vertex or point in the knowledge tree connecting the antecedents and consequents of rules
in the knowledge base. In most conventions the nodes are the rules and the antecedents and
consequents are the edges between the nodes or vertices.

Number -- An integer or real number.


Pattern Expression -- An expression containing variables and involving objects and their attributes.
These patterns in the expression contain combinations of symbols denoting constant and variable
objects. They will not normally contain predicates which have the ability to reference themselves in
their arguments.

Pattern Matching -- The process of matching a general pattern expression to an instantiation or specific
instance of an object or to another pattern expression. The process proceeds in a forward-oriented,
bottom-up reasoning process.
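
A minimal sketch of such a matcher, assuming the convention that symbols beginning with "?" denote variables (the `match` function and its examples are hypothetical illustrations):

```python
def match(pattern, datum, bindings=None):
    """One-way matcher: fit a pattern with variables to a specific datum.

    Returns a dict of variable bindings on success, or None on failure.
    """
    bindings = dict(bindings or {})
    # A '?'-prefixed string is a variable: bind it, or check an old binding.
    if isinstance(pattern, str) and pattern.startswith("?"):
        if pattern in bindings:
            return bindings if bindings[pattern] == datum else None
        bindings[pattern] = datum
        return bindings
    # Structured patterns must match element by element.
    if isinstance(pattern, (list, tuple)) and isinstance(datum, (list, tuple)):
        if len(pattern) != len(datum):
            return None
        for p, d in zip(pattern, datum):
            bindings = match(p, d, bindings)
            if bindings is None:
                return None
        return bindings
    # Constants must match literally.
    return bindings if pattern == datum else None

print(match(("married", "?x", "?y"), ("married", "john", "mary")))
# {'?x': 'john', '?y': 'mary'}
```

Matching a general pattern to a specific instance produces the bindings under which the two agree, which is the forward-oriented instantiation step the definition describes.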

Predicate -- A logical relation that affects one or more objects or variables. A predicate specifying a
relation between n types of arguments is usually written as a mapping (which must also be a function)
having n arguments. Predicates, as opposed to relations, may have one argument. Predicates are
defined by giving a series of logical rules which specify an algorithm for computing the value of the
function which specifies its name. Objects, as explained above, are defined by giving values to the
attributes that make up their structures or by computing these values using methods (which are usually
not recursive).

Procedure -- same as method.


Relation -- We speak of relations as holding between two things or among several things. Thus the
relation of being married holds between a man and a woman. A relation between n types of objects is
written in terms of a mapping with n arguments.

Recursion -- A process by which a predicate or function is defined in terms of itself. This
self-reference allows the function or predicate to be computed in an orderly manner.
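
The standard factorial function is a minimal example of such an orderly self-referential computation, sketched here in Python:

```python
def factorial(n):
    """n! defined in terms of itself."""
    if n <= 1:                      # base case keeps the recursion orderly
        return 1
    return n * factorial(n - 1)     # self-reference on a smaller argument

print(factorial(5))  # 120
```

The base case guarantees that each self-reference is applied to a strictly smaller argument, so the computation terminates.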

Stream -- An ordered set of objects tied together one to another. It has a first, but not necessarily a last
element.
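
Python generators give a natural sketch of a stream: a first element is always available on demand, but no last element need ever be produced (the `naturals` generator below is a hypothetical illustration):

```python
def naturals():
    """An unbounded stream of the natural numbers, produced on demand."""
    n = 1
    while True:
        yield n
        n += 1

stream = naturals()
print(next(stream), next(stream), next(stream))  # 1 2 3
```

Unlike a list, the stream never materializes all of its elements at once; each element is tied to the next by the generator's suspended state.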


Subgoal -- A relation, possibly involving objects or variables, which is necessary for the satisfaction of
another goal.

Unification -- The process by which a theorem proving machine (i.e. a logic programming compiler)
tries to match a goal against facts or already instantiated predicates on the left hand side of rules in
order to satisfy that goal, or to determine one or more further sub-goals necessary to evaluate the
original subgoal. The process uses backward oriented reasoning and proceeds in a top-down manner.
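
A minimal two-way unifier can sketch this process, again assuming "?"-prefixed symbols denote variables (the `unify` function below is an illustrative simplification that omits the occurs-check a real logic programming system would perform):

```python
def unify(a, b, subst=None):
    """Find a substitution making the two terms equal, or None.

    Unlike one-way pattern matching, variables may bind on either side.
    """
    subst = dict(subst or {})
    if isinstance(a, str) and a.startswith("?"):
        if a in subst:                   # variable already bound: follow it
            return unify(subst[a], b, subst)
        subst[a] = b
        return subst
    if isinstance(b, str) and b.startswith("?"):
        return unify(b, a, subst)        # symmetric case
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):           # unify structures element by element
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return subst if a == b else None     # constants must match literally

print(unify(("parent", "?x", "mary"), ("parent", "john", "?y")))
# {'?x': 'john', '?y': 'mary'}
```

Unifying a goal against a rule's consequent in this way is what produces the instantiated sub-goals that backward chaining then tries to satisfy.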

Variable -- A name which represents the value of an unknown object.


Word -- A name.

VII

A Twenty-First Century Mathematical Philosophy that helps us Understand what a concept is.

A concept is a way to determine a set of numbers so that we can do more mathematics with it.

“God created the integers, the real numbers, the world, space and time, the plants, the birds, the
animals, and you and me. All else is the work of humankind. “ Genesis, chapter 1 + Immanuel
Kant+Leopold Kronecker +Gottlieb Frege+ Henri Poincare+ Jacques Herbrand+Kurt Godel

At the end of the nineteenth century, Leopold Kronecker, a famous mathematician and number
theorist contemporary of our friend Gottlieb Frege, made his fascinating, much quoted statement, “God
created the integers, and all else is the work of man.” Henri Poincare reestablished among us the habit
of reading books on natural philosophy written by mathematicians. These influential books would lead to the
creation of a school of mathematics which thought of concepts the way Bishop Berkeley did (see below
the explanation of who the “conceptualists” were). As mentioned above this was a time of many new
scientific discoveries and much progress in our better understanding of the philosophy of mathematics.
The German school of mathematicians (some of the other famous names involved in this were
Weierstrass and Cantor) were attempting at this time to put the definition of what a “real number” is on a firm
philosophical, metaphysical, and logical foundation. This problem arose along with the invention by
Newton and Leibniz of the new methods and formulas used in the infinitesimal [differential] and integral
calculus. What do we mean when we say we are going to take the “limit” of a
series of “real numbers” in order to calculate what the derivative or the integral of a function is? This
new field of mathematics, along with that of the algebraic geometry of Descartes, spurred a new level of
human understanding and control of what was called before that the “natural sciences.” Perhaps the
most difficult and deep question that arose from this was, “What do we mean by, and how do we want to
define logically, what it is for something to be a set of mathematical objects?” First of all, as discussed
above, we have to be able to define what an object is. Next, we have to be able to define how to
determine and begin to know what a collection or set of these “objects” is. Of course, there are many
different ways of doing this. Philosophers had argued for millennia about what an object is [what does it
mean for something to be real]. Plato has given us one answer; Aquinas, Hume, and Locke others. If we
can’t answer this question, how are we going to be able to define what a set is in order to do deductive
and inductive reasoning about it? Well, a hundred or so years later, we are still arguing about it. But the
good news is that we have invented much of modern logic and the digital computer and the internet to
go along with the computer while trying to figure out the answer to this question.
From our viewpoint here in 2008 [reading a college textbook which was written in 1964] there
were four different metaphysical solutions which philosophers of mathematics and science have
proposed so far. These arose out of four different ways of resolving the mathematical counterexamples,
new definitions about what the mathematical infinite is, philosophical and metaphysical conundrums
about what truth is, and logical paradoxes about how we claim that we can understand it and teach it to
others once we have discovered it. These were pointed out by the pioneers of thought in this
fascinating area of human inquiry. They helped us know how we might better understand how God
created everything, and did this by taking three steps backward and analyzing, studying, and meditating
a little more carefully than anybody had up to this point about how our own reasoning processes in these
fields work. Not a bad strategy in general, is it, for anybody trying to figure out eternity and truth and
all of that. The trick is to focus more on the process and not the results.
Our viewpoint about what a possible philosophy of mathematics is depends on what our viewpoint
is about how we should think about conceptual knowledge. We can be:
1) A realist, like Plato, maintaining that universals are real abstract entities. Then we have to decide
whether we think that our minds have the power to discover or comprehend these entities through
rational insight.
2) A conceptualist, like Bishop Berkeley, who believes that although universals are real they do not
have any reality in the World apart from our thinking about them. That is, we believe that they are
created within our own minds.
3) A nominalist, like the English empiricists who maintain that all knowledge is completely determined
by sense impressions and there are no such universal concepts, nor are there any abstract entities
corresponding to them in our minds.
4) Immanuel Kant argued that our knowledge of universal ideas like the idea of number rests on our
awareness of time as a pure form of intuition. He argues that this knowledge is both a priori and
synthetic. He believed in a “potential infinity” instead of an “actual infinity”.
5) Gottlieb Frege introduced an important qualification to Kant’s arguments (in its full form it was later
shown to be mathematically unsound by Godel). He asserted that the laws of numbers are all analytic
and can be reduced to logic alone. Thus he proposed we should consider mathematical concepts as
real entities [existing independently of us, as Plato said, but unlike Kant, who argued that all
knowledge of them is based on empirical sense impressions and mental constructs inside our own
minds]. And he believed that therefore mathematics did not need to be thought of as talking about
‘reality’. But he argued that the formal reality in it held the key to an explanation of whatever
reality we want to use it for.
Thus for him we are only justified in using mathematics when we want to discover conceptual truth in
circumstances of formal algorithmic, robotic environments such as those explained above. David
Hilbert, perhaps the most famous and influential mathematician of the time, followed Frege’s and
Bertrand Russell’s and Alfred Whitehead’s work in this area with a book written along with W.
Ackermann called “Principles of Mathematical Logic”. This book was written from a formalist and
intuitionistic philosophic standpoint. The writers wanted to demonstrate and explain how it would be
possible to logically, consistently, and completely formalize the way mathematicians prove theorems.
Russell and Whitehead’s work also had the philosophic purpose of showing how it would be possible to
axiomatize and logically prove all of mathematics (or at least that of the real numbers), but they did not believe,
like Hilbert and Kronecker, that only the integers needed to be used to index what we have called
recursive functions (the people we have called intuitionists). Thus Russell and Whitehead were realists.
In the early 1930’s Jacques Herbrand in France proved his theorem that gave a way to settle the
question of the logical consistency of the integers (which Poincare had posed in his books Science and
Hypothesis and Science and Method). Then, a little after this, Kurt Godel proved his equally important
theorem that showed the impossibility of demonstrating any theory which was consistent and explained
all the properties of the integers along with the real numbers. This result effectively shut down the
possibility of completing the philosophic/mathematic program of Hilbert. But, as mentioned earlier,
because these gigantic thinkers in number, philosophy, and time led us all in these mental directions,
the foundations of thought required to invent the modern computer and micro-processors had already
been laid, these gadgets themselves being the primary tools that directly ushered us all into our
present day age of explosion of information and understanding about a common shared treasure house of
easily accessible human knowledge (the internet).

Here is some more background terminology which will help us understand better some of the
previous thinking which led to the above philosophies of mathematical metaphysics. Immanuel Kant in
his Critique of Pure Reason introduced four ways of classifying knowledge or truth: 1) analytic
knowledge or truth which separates out a component presented as a concept that we are trying to
understand through a judgement. A statement is analytic if and only if nothing other than understanding
(i.e. no experiments are necessary to justify it) is required to enable one to know whether it is true or not.
2) synthetic knowledge or truth --- synthesis of concepts through judgements (acts connecting concepts
or holding them together in our consciousness), 3) a priori knowledge --- that attainable prior to
experience and which does not need to be judged by it, 4) empirical knowledge or truth --- that based
on experience and which requires justification from experience. In the previous section explaining the
5th definition of a concept, the modern artificial intelligence terms 1) forward chaining and 2) backward
chaining can give us some more details of how Kant’s 1) synthetic, empirical and 2) analytic, a priori
ways of knowing might work.
The English mathematicians Alfred Whitehead and Bertrand Russell were the first to propose a
consistent philosophical/metaphysical system to do this. They believed they could give a better way to
define what the “set” that makes up the set of objects is. Because they were realists like Frege, this
approach allowed Cantor’s infinite objects. The German mathematicians who came after Frege settled
on two different approaches. David Hilbert focused on trying to completely formalize in logic the
functional or algorithmic part of how sets or classes (a more general term for a collection of objects
than a set) were computed or determined. As many of the readers of this know, this great plan was proved
to be impossible in the 1930’s by the Princeton mathematician and logician Kurt Godel. But much of the
mathematical and logical foundations for the present day discipline of what we now call computer
science were laid before this was accepted as a basic limitation on human knowledge and
understanding using this approach. Two other German mathematicians, Theodore Fraenkel and Ernst
Zermelo, along with the French mathematical genius Henri Poincare, developed a different approach to
what it means to prove something mathematically. Their philosophy of mathematics is called
“intuitionism.” In this approach one is not obliged to accept Aristotle’s fundamental axiom which is
called the “law of the excluded middle.” This law asserts that a statement must be either true or not;
there is no third outcome which is possible. Thus, taking this point of view, it is possible in some
sense to welcome the counterexamples and paradoxes that were discovered at the end of the 19th
century. By applying them to our question about “what is a set of mathematical numbers” or “what is a
set of objects” we see that it might be possible for an object or a number to possibly be in the set and
possibly not.
Nowadays, we know that this assumption has important consequences for present day discussions
about “multiple reality worlds” and whether it might be possible for us or our souls to go back in time.
For, as we proceed along our individual world lines in time and space, if we carry along as part of the
environment in the functional computation the partial search paths about what has happened to us in
our past, we can then backtrack better in the network of optimal routes toward our individual goals in a
shared context of both the greatest and highest good for all of us. Something of no small importance if
we are trying to make some small steps to put religion and science back on friendly terms again, after
they were separated 1000 years ago by Saint Thomas Aquinas. A fourth, perhaps more brilliant, answer
to the problems posed by the counterexamples and paradoxes was figured out by one of the last
century’s greatest thinkers, Dr. John Von Neumann. As a 1st LT working for the US Army Reserves in the
1970’s at the Ballistic Research Laboratories in Aberdeen, MD, I had the opportunity to sit at the desk
where he and some people from the U. of Pennsylvania assembled the first ENIAC computer (which
was later moved to the Smithsonian Museum of Natural Science on the north side of the Washington DC
mall). Looking back on it 35 years later, I am sure I hardly had any impact on the place compared
to that which he did. But it was a fascinating experience and increased my interest in computer
science and this area of mathematics. Later I wrote a few papers for the US Army RD Science
Computing meetings on how various approaches to network optimization and search are related to this
and to the logical foundations of set theory. This less technical and more philosophical paper took me
much longer to finish up. As I mentioned earlier, I started a first version of it around that time. Dr. Von
Neumann’s idea, the fourth approach mentioned, which claims to show us how to come up with a
group of axioms for mathematical logic and set theory, was, instead of restricting the existence of
sets (inside the domain of collections or classes of objects), to restrict the type of entities which
could be elements of those sets. In logic both objects and functions can be thought of as elements to be
computed with through recursive or self-referential thought. If the objects and functions are ordered in
a sense that there is also a smallest element in the networked lattice created by all the possibilities, then
the computation becomes more well-behaved. This in a nutshell is his approach. Dr. Alan Turing’s
foundational papers, in which he first clearly defined how we can reduce what computation is to its most
basic functions and elements (Turing machines), built on Dr. Von Neumann’s theory of ordinal numbers
that arises from this approach to how to create a consistent set of axioms for set theory.
