Leibniz - a Digital Scientific Notation

A scientific notation is a convention for encoding scientific information using symbols. The best known example is mathematical notation. The goal of a scientific notation is to represent scientific knowledge in a way that humans can easily comprehend and manipulate. While in principle a mathematical equation could be replaced by an equivalent statement in plain language, the more concise equation is faster to read (assuming a trained reader) and allows manipulation by formal rules (such as "add the same term to both sides").

Good notations matter

In his book "An Introduction to Mathematics", Alfred North Whitehead writes (Chapter V, “The Symbolism of Mathematics”):

"By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race. Before the introduction of the Arabic notation, multiplication was difficult, and the division even of integers called into play the highest mathematical faculties. Probably nothing in the modern world would have more astonished a Greek mathematician than to learn that, under the influence of compulsory education, a large proportion of the population of Western Europe could perform the operation of division for the largest numbers. This fact would have seemed to him a sheer impossibility."

Notations for the digital age

A digital scientific notation is a scientific notation that can be processed by both humans and computers. A machine readable notation is necessarily a formal language and thus has a well-defined unambiguous syntax in addition to some useful level of well-defined semantics.

There are many formal languages designed for representing scientific information. An example is the Systems Biology Markup Language (SBML). Most of them do not qualify as digital scientific notations, because they are designed to be used by software but not for communication between humans.

There are also formal languages that are designed to be read and written by humans, in addition to computers. Programming languages are the most prominent examples. In scientific computing, programming languages are routinely used to represent scientific knowledge as program code. In particular, computational notebooks embed code written in high-level programming languages such as Python or R into a narrative, much like mathematical notation is used in traditional scientific publications. However, programming languages fill the role of scientific notations rather poorly. In particular, they can express nothing but complete executable algorithms, which include many technical details that are irrelevant to the science behind the code.
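As an illustration of this last point (a hypothetical example, not taken from the Leibniz project), consider the exponential decay law dn/dt = -k·n. The mathematics is one line, but an executable version must additionally commit to a numerical integration scheme, a step size, and a stopping condition, none of which belong to the science being expressed:

```python
# Hypothetical illustration: the decay law dn/dt = -k * n is one line
# of mathematics, but executable code must also fix a numerical scheme,
# a step size, and a stopping condition -- technical details that are
# irrelevant to the science behind the code.

def decay(n0, k, t_end, dt=0.001):
    """Integrate dn/dt = -k * n with the explicit Euler scheme."""
    n, t = n0, 0.0
    while t < t_end:
        n += -k * n * dt   # Euler update: a numerical choice, not physics
        t += dt
    return n

# The exact solution is n0 * exp(-k * t); Euler only approximates it.
print(decay(1.0, 1.0, 1.0))  # close to exp(-1) ≈ 0.3679
```

Every choice in this sketch (Euler rather than Runge-Kutta, dt=0.001, a while loop) is an implementation detail that the equation itself never mentions.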

To summarize Leibniz in a catchy phrase: it bridges the gap between mathematical notation and programming languages.

Leibniz is named after Gottfried Wilhelm Leibniz, who made important contributions to science, mathematics, formal logic, and computation, topics that are all relevant to this project. He invented a widely used notation for calculus, laid the foundation of equational logic with his definition of equality, and anticipated formal logic with his "calculus ratiocinator".

What Leibniz is not

Two common misconceptions deserve an explanation:

Leibniz is not a programming language

Leibniz is not a computer algebra system

Current status

Leibniz is currently a research prototype. It has a solid foundation (order-sorted term algebras, rewriting, equational logic), but the use of this foundation in the design of a notation suitable for communication between humans is still under exploration. Moreover, the implementation is in many respects suitable only for insiders. In particular, error checking is very incomplete and focuses on detecting typos rather than on guiding users in the formulation of correct and clear code. And while personally I don't mind dropping into a Pharo debugger when something goes wrong, I am aware that this is not acceptable behavior for an authoring system meant for productive use in research.
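To give a rough idea of what "rewriting" means in this foundation, here is a toy sketch (in Python, for illustration only; Leibniz itself is implemented in Pharo, and this code is not part of it). Terms are nested tuples, and a single rewrite rule, x + 0 → x, is applied bottom-up:

```python
# Toy sketch of term rewriting, for illustration only -- this is not
# Leibniz's actual implementation. A term is a nested tuple such as
# ("+", "x", 0); simplify() applies the rule x + 0 -> x bottom-up.

def simplify(term):
    """Rewrite subterms of the form x + 0 to x, innermost first."""
    if not isinstance(term, tuple):
        return term  # a variable or constant: nothing to rewrite
    op, *args = term
    args = [simplify(a) for a in args]  # rewrite subterms first
    if op == "+" and args[1] == 0:
        return args[0]  # the rule x + 0 -> x
    return (op, *args)

print(simplify(("+", ("+", "x", 0), 0)))  # -> x
```

A real rewriting engine adds pattern variables, matching, sort checking, and a termination-aware strategy; the point here is only the basic idea of transforming terms by formal rules.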

Background reading

Scientific notations for the digital era (arXiv preprint, 2016)

Scientific communication in the digital age (Physics Today, 2016)

Verifiability in computer-aided research: the role of digital scientific notations at the human-computer interface (PeerJ Computer Science, 2018)

Computational science: shifting the focus from tools to models (F1000Research, 2014)