Biology, Computing, Mathematics, Physics, Research

Areas of Semantic Research

There are many path-breaking areas of research at the nexus of meaning and matter. I am particularly interested in the following areas, with the specifics described below.

Quantum Physics

The problems of quantum physics are legion, but their solution is still not known. My approach to research in quantum theory is that quanta are not things but symbols. Symbols have semantic properties that make them behave like concepts rather than things. The paradoxes of quantum theory vis-a-vis classical physics are problems of understanding how concepts behave differently from classical particles.

I believe that current quantum theory requires a new mathematical formulation that derives the physical aspects of quanta from their semantic aspects. This new semantic theory will come from attempts to incorporate meanings within mathematics.

Number Theory

There is no area in science where the problems of meaning can be stated more succinctly and demonstrated more clearly than mathematics. The history of mathematical development is replete with paradoxes and limitative results such as Gödel's incompleteness theorems, the Burali-Forti paradox, Zeno's paradoxes, Tarski's undefinability of truth, etc. These paradoxes arise through a curious mingling of everyday concepts and numbers. Mathematics employs many distinct notions of number, but its formalism is unable to distinguish between them.

I believe that mathematics requires a new theory of numbers in which numbers are understood as types rather than quantities (types in everyday language denote concepts rather than quantities; numbers should also be treated as types, i.e. concepts). Previous attempts to include types in mathematics have led to paradoxes because types were derived from quantities, based on the idea that the world is primarily objects from which we construct meanings. The inverse proposition, that the world is primarily meanings from which objects are created, is free from these paradoxes and represents a different foundation for mathematics.
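As a toy illustration in ordinary Python (not the proposed theory itself, and with class names chosen only for this sketch), the distinction between a number used as a quantity and a number used as a type or label can be kept apart by the type system: quantities support arithmetic, while labels support only identity.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Ordinal:
    """A number used as a type or label ('first', 'room 3'), not a magnitude."""
    rank: int

@dataclass(frozen=True)
class Cardinal:
    """A number used as a quantity; supports arithmetic."""
    count: int
    def __add__(self, other):
        return Cardinal(self.count + other.count)

# Cardinals add as quantities:
total = Cardinal(2) + Cardinal(3)
print(total.count)  # 5

# Ordinals are labels: 'first' + 'second' has no quantitative meaning,
# so this class deliberately provides no arithmetic. Only identity applies:
first = Ordinal(1)
print(first == Ordinal(1))  # True: sameness of type, not of magnitude
```

The sketch only shows that the two uses of "1" behave differently once they are separated; the document's claim is that mathematics currently conflates them.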

The Nature Of Chemical Law

The central dogma in modern chemistry is that it can be reduced to the study of atoms. However, in quantum theory the atoms are not fixed entities. Rather, their electrons can float from one atom to another, and what we commonly call 'atomic bonds' are replaced by 'molecular orbitals'. In short, we can no longer treat a molecule as an aggregation of individual atoms; we must instead treat the molecule as an ensemble of particles with some total energy that can be divided in many ways, producing different structures.

How molecules change their forms, redistributing the energy and matter within the ensemble, is an area of indeterminism in science. The indeterminism arises because quantum theory deals with the total energy in a system, but not with the distributions of this energy. The redistribution of energy is called the measurement problem, where, for instance, by changing the number of slits in the double-slit experiment we can reorder the quantum states. What corresponds to the slits in the case of molecules, where we do not have an observer? Unless this problem is solved, the structure of a complex molecule will remain a mystery, and chemical reactions will remain unpredictable.

The first step toward the solution of this problem is recognizing what the different molecular structures mean. This is possible if we treat the atoms themselves as symbols of meaning, such that the molecule becomes a complex meaningful proposition. Once these structures have been given meaning, a chemical reaction would correspond to the transformation of meaning governed by the laws of meaning change. There can be many causes of these meaning changes, including observer interference, but the structures of molecules are like sentences that connect the words (atoms) into a grammatical structure. If physics studies the atoms, the words of the language, then chemistry studies how grammar structures these words into meaningful sentences. The laws of chemistry are like the laws of grammar. You can use the same set of words to construct many different sentences, but which sentence is produced is governed by the meaning.
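The grammar analogy above can be made concrete with a toy sketch. The words and the single rule here are hypothetical, chosen only for illustration: the same set of words admits many orderings, but a grammar selects the few that form meaningful sentences, just as the same atoms admit only certain molecular structures.

```python
from itertools import permutations

# The same three 'atoms' (words) can be arranged in 3! = 6 ways...
WORDS = ["dog", "bites", "man"]

def grammatical(sentence):
    # Hypothetical toy rule: a noun, then the verb, then a noun.
    nouns, verbs = {"dog", "man"}, {"bites"}
    a, b, c = sentence
    return a in nouns and b in verbs and c in nouns

# ...but the grammar admits only two of them as sentences.
valid = [" ".join(p) for p in permutations(WORDS) if grammatical(p)]
print(valid)  # ['dog bites man', 'man bites dog']
```

Note that the two surviving orderings carry different meanings, which echoes the claim that which structure is produced is governed by meaning, not by the inventory of parts.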

Biological Information

The central problem in biology is the transcription and replication of genes, through which heredity is produced. The transcription of genes is the classic example of how abstract information encoded in the DNA is expanded into contingent information in the proteins. Similarly, the replication of genes is a classic example of how meaning in the mind is copied in the process of vocal expression. The person speaking the meaning doesn't lose the mental meaning; rather, he or she creates a copy of that meaning and externalizes it.

Ordinary chemical molecules like water and salt do not replicate. What causes the DNA to replicate, and what type of phenomena does replication denote? Additionally, what properties in molecules make them descriptions of other molecules, such as the proteins in the case of DNA? These are questions best answered by understanding biology in terms of meaning. Just as a table of contents represents the book in a summarized form, the DNA represents the protein structures in a summarized form. The book expands from the table of contents, and although both the table of contents and the pages of the chapters are physical things, one is abstract and the other is contingent. The DNA is therefore the encoding of abstract information, while the proteins are representations of contingent meanings.
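The expansion of compact genetic information into protein structure is standard molecular biology, and a minimal sketch shows the 'summary' relation: each three-letter codon is a compact symbol standing for one amino acid. Only a handful of entries from the real standard genetic code are included here.

```python
# A few entries of the standard genetic code (codon -> amino acid).
CODON_TABLE = {
    "ATG": "Met",  # methionine, also the start codon
    "AAA": "Lys",  # lysine
    "GGC": "Gly",  # glycine
    "TAA": "STOP", # stop codon: end of translation
}

def translate(dna):
    """Expand a DNA coding sequence into its amino-acid sequence."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE[dna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("ATGAAAGGCTAA"))  # ['Met', 'Lys', 'Gly']
```

Twelve letters of DNA stand for a three-unit protein fragment; the compact sequence is the 'table of contents' from which the larger structure is expanded.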

I believe that DNA replication represents a new class of phenomena in which one molecule summarizes many molecules. Biologists freely talk about information in biology in the sense that the DNA encodes meaning, but the representational properties of meaning have no explanation. DNA replication and transcription correspond to a new type of property in matter that is best described as meaning.

Semantic Computing

The failure of artificial intelligence (AI) at intelligent tasks such as language comprehension (let alone creativity, problem solving, and metaphorical thinking) has led most scientists to recognize that computers cannot deal with meaning. Why? Because a token in the computer has physical properties but no meaning; the programmer supplies that meaning while designing, writing, and testing programs. In what way is the human mind able to design, write, and test programs as a computer cannot? How does a human mind fix problems in a computer program when Turing's halting problem proves that no program can, in general, detect an arbitrary program's problems?
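Turing's argument behind the halting problem can be sketched in Python. The `halts` oracle here is hypothetical (no such function can exist); the sketch only shows how any claimed oracle is refuted by its own diagonal program.

```python
def paradox(halts):
    """Given a claimed halting oracle halts(f, x), build its refuting program."""
    def g():
        if halts(g, None):   # if the oracle says g halts...
            while True:      # ...then g loops forever,
                pass
        return None          # otherwise g halts: the oracle is wrong either way.
    return g

# A (necessarily wrong) oracle that claims every program halts:
def claimed_oracle(f, x):
    return True

g = paradox(claimed_oracle)
# claimed_oracle(g, None) is True, yet running g() would loop forever,
# contradicting the oracle's own verdict. The same construction defeats
# any candidate oracle, so no general halting detector can be written.
```

The human programmer, by contrast, routinely judges whether particular programs terminate, which is the asymmetry the paragraph above points to.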

Advances in semantic computing depend on advances in mathematics and physics in which a symbol can denote types. Computing theory has an important role in delineating the types for different classes of problems. In other words, semantic computing is about finding the elementary symbols and operations (i.e. a language) with which any semantic program can be written, compiled, and executed.