#002 | Language, Form, and Automation: Panini to Leibniz

In this episode, we explore some of the earliest theoretical ideas of the discipline that would eventually become Computer Science. Our journey takes us as far back as ancient India, leading us through the medieval Middle East and into the Enlightenment in Europe.

Episode Transcription

Hi, this is Wes Doyle. Welcome back to the Bitwise podcast, where we explore the history of computer science, the discipline of software development, and the implications of an increasingly digital world. In the last episode, we examined some of the early tools used for computation, which took us on a journey through ancient parts of Africa, Europe and East Asia. In this episode, we'll travel once again to many of those regions. This time, however, we'll examine history through the lens of philosophy and mathematics to get a broad sense of the intellectual underpinnings of computer science.

It's difficult to choose where to start here, since a theory of computation doesn't really become concrete until the 20th century, when we start talking about the work of Kurt Gödel, Alan Turing, Alonzo Church, and others. So, for now, we'll look at some of the formative ideas about formal language, numbers, and systems of logic that formed the intellectual basis for what ultimately became computer science. There are several interesting segues between science and engineering in this history, especially when we start to look at some of the 17th and 18th century figures like Pascal and Leibniz, who contributed in some sense to both the engineering and the philosophical traditions. We'll work our way up to those individuals, which will lead pretty nicely into an examination of the work of Charles Babbage and Ada Lovelace in the next episode. But first, let's go way back once again to ancient India to explore a scholar named Panini.

It seems that there is some degree of uncertainty in the chronology of ancient Hindu history. Therefore, it's difficult to know exactly when Panini lived, but most likely he lived sometime between the 4th and 6th century BCE. I love it when I stumble upon a figure like Panini, someone whose work has rippled throughout many different areas of human thought, and who had a profound impact on linguistics and the history of scholarship. He's someone who perhaps isn't as well known in the West, at least considering his monumental reputation. He is sometimes called the father of linguistics. He's famous for having written a text called the Astadhyayi, a collection of 3,959 rules of the grammar of the Sanskrit language. His treatise on language is so thorough and technical that he essentially developed the world's very first formal system - basically a coherent mathematical model for language. He even used concepts of generative grammar and recursion in demonstrating that new sentences can be constructed from other sentences according to sets of rules. What's remarkable about this is that it really wouldn't be until the 19th century that work on formal systems of logic would be picked up again in such a significant way, by people like Gottlob Frege and George Boole, both of whom we'll talk about in a future episode.

If we jump ahead a century or two - to between the 3rd and 2nd century BCE (again, here the timeline is kind of uncertain) - we encounter another Indian scholar by the name of Pingala, who presented the first known example of a binary number system in a series of chapters called the Chandasastra. Pingala's text is focused on studying the form, and specifically the meter, of Sanskrit poetry. To provide some context for his work, it's worth mentioning that much of ancient Sanskrit literature is poetry - found in the Vedas, for instance, and by famous poets like Kalidasa. This study of form may have developed in part as a way to preserve the shape of the poetry as it was passed down over time. In his binary system, he innovated the use of two symbols for long and short syllables, representing each as either light - laghu - or heavy - guru - rather than zero and one as we might today. His system is actually in some ways analogous to Morse code notation, in that we have long and short atomic units to work with. His work has significance for combinatorics, computer science, and the study of Sanskrit and the Vedic tradition as well.
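For readers of this transcript, here's a small sketch of the core idea - my own illustration, not notation from Pingala's text: every n-syllable meter is a sequence of light and heavy syllables, so the meters of length n correspond exactly to n-bit binary strings.

```python
def meters(n):
    """Enumerate all syllable patterns of length n: laghu ('L') or guru ('G')."""
    if n == 0:
        return [""]
    shorter = meters(n - 1)
    # Each shorter pattern extends two ways, just like appending a 0 or 1 bit.
    return [s + "L" for s in shorter] + [s + "G" for s in shorter]

patterns = meters(3)
print(len(patterns))  # 8 three-syllable meters, matching the 8 three-bit numbers
print(patterns[0], patterns[-1])  # LLL GGG
```

The doubling at each step is the same recurrence that gives 2^n binary strings of length n, which is why Pingala's enumeration is often cited as an early appearance of binary counting.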

If we jump ahead about a hundred years, we encounter another Indian mathematician and astronomer called Brahmagupta. His contributions to the development of algebra, geometry and astronomy were significant, but it's really his description of the use of zero that I find most fascinating. As far as we know, he's the first to write down rules for the use of zero and negative numbers in calculation. In fact, his text, the Brahmasphutasiddhanta, is the first to treat zero as a true number rather than just a placeholder or a symbol representing the lack of some quantity, as we find it used by the ancient Babylonians and the Romans. So we find a rule that multiplying any number by zero results in zero, a rule that multiplying two negative numbers gives a positive number, and a rule that multiplying a positive and a negative number gives a negative number.
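As a quick aside for transcript readers, those sign rules are exactly the ones baked into modern integer arithmetic; this toy check (my framing, not Brahmagupta's notation) spells them out.

```python
def sign(x):
    """Return 1, 0, or -1 depending on the sign of x."""
    return (x > 0) - (x < 0)

# Brahmagupta's rules, checked against ordinary integer arithmetic:
assert 7 * 0 == 0          # any number multiplied by zero is zero
assert sign(-3 * -4) == 1  # two negatives multiply to a positive
assert sign(-3 * 4) == -1  # a positive and a negative multiply to a negative
print("all three rules hold")
```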

How often do you find the word algorithm used in the context of computer science? If you've spent any time as a software developer during your career, then you've probably spent a few long nights and weekends practicing algorithms to prepare for an interview or just to get a feel for a new programming language. If you studied computer science at any point in school, then you may have taken courses on algorithms and algorithm design. The word algorithm is derived from the word algorism, which is a technique for performing arithmetic by writing numbers in place value and applying a set of rules to those digits. The word algorism is, in turn, derived from the name of a Persian mathematician born in modern-day Uzbekistan named al-Khwarizmi. His name was later Latinized to Algorithmi. Like many of the other scholars in the thread we're currently following, he made contributions to mathematics and astronomy. However, it's due to his work with base-10 integer arithmetic that the Hindu-Arabic numerals actually spread throughout the Middle East and Europe.

In his elaborately titled work, The Compendious Book on Calculation by Completion and Balancing, from around 820 CE, we find rule-based methods for solving linear and quadratic equations by means of reduction and equation balancing. In fact, one of the operations described in that text is called al-jabr, which, as you may have guessed, is where we get the term algebra. The al-jabr operation describes adding the same value to both sides of an equation, like you might learn in any algebra class as a student today.
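For transcript readers, here's a short sketch of completion and balancing applied to equations of the form x^2 + bx = c, assuming positive coefficients and a positive root as al-Khwarizmi did; the sample equation x^2 + 10x = 39, with root 3, is the famous worked example associated with the text.

```python
import math

def solve_completed_square(b, c):
    """Return the positive root of x^2 + b*x = c by completing the square."""
    half_b = b / 2
    # Balance: add (b/2)^2 to both sides, giving (x + b/2)^2 = c + (b/2)^2.
    rhs = c + half_b ** 2
    # Take the square root of both sides, then subtract b/2 from each.
    return math.sqrt(rhs) - half_b

print(solve_completed_square(10, 39))  # 3.0
```

Every step is an instance of doing the same thing to both sides of the equation - the balancing the title refers to.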

So, the bed of knowledge upon which computer science is founded is vast. It's a little fuzzy. It spans a long period of time and geography. There are many individuals here that I haven't mentioned, but there are a few from around the same time period that definitely made some contribution to the way that we think about concepts in mathematics that relate to computer science.

al-Kindi is an example from the 9th century, from a region in modern-day Iraq. He is known for having discovered frequency analysis, which basically consists of counting the frequency with which letters occur in a language. This can, of course, be used to solve basic substitution ciphers. There's also the Egyptian al-Qalqashandi, who did work on cryptology and ciphers in the context of government.
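For transcript readers, a minimal sketch of frequency analysis - the sample ciphertext is my own, a Caesar shift of 4: count letter frequencies in the ciphertext, then match them against the known letter frequencies of the language. In this tiny sample the most frequent ciphertext letter, 'w', happens to stand for plaintext 's'; on longer English text, 'e' usually dominates.

```python
from collections import Counter

def letter_frequencies(text):
    """Relative frequency of each letter in the text, ignoring non-letters."""
    letters = [ch for ch in text.lower() if ch.isalpha()]
    counts = Counter(letters)
    return {ch: n / len(letters) for ch, n in counts.items()}

ciphertext = "XLMW MW E WIGVIX QIWWEKI"  # "THIS IS A SECRET MESSAGE", shifted by 4
freqs = letter_frequencies(ciphertext)
print(max(freqs, key=freqs.get))  # 'w' is the most frequent ciphertext letter
```

Given enough ciphertext, matching these frequencies against a language's known distribution recovers the substitution mapping, which is exactly why simple substitution ciphers are insecure.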

I also want to take a moment to mention Ramon Llull, from the Kingdom of Majorca in modern-day Spain, in the 14th century. He's sometimes regarded as the founding father of information science. He may be particularly famous because his work had an influence upon the mathematical giant Gottfried Leibniz. Leibniz actually wrote his dissertation about Llull, who was a prolific writer and has an interesting biography. In short, as far as we're concerned with this history, he is in some ways one of the first individuals to try to carry out logical deduction in a mechanical fashion by some set of rules.

He believed that human thought could be mechanistically described, and that given some set of agreed-upon initial truths in any field of knowledge, all other knowledge could be derived by combining those truths in different ways. His vehicle for this was essentially an attempt at deducing certain religious truths, and at converting people to those beliefs. But his idea of applying a mechanistic process to deduction is worth noting, as is his work in setting up systems of classification.

Later, as we move into the seventeenth century, one of the fathers of modern political philosophy, Thomas Hobbes, comes onto the scene. While Hobbes' work doesn't relate directly to the current thread we're on here, he did write a very famous work you may have heard of or studied at some point - The Leviathan - which was published in 1651, the year that the English Civil War ended - or rather, a series of civil wars - after nearly a decade of bloodshed across England. The war profoundly affected Hobbes, his thought and his philosophy.

If you're not familiar with the Leviathan, it is a book that's essentially a formative work on developing a theory of the social contract. Hobbes makes the case that as individuals, we give up some of our freedoms, agreeing to a sort of contract or covenant with others - forming the state - to abide by a system of laws and governance so that we might live more fulfilled lives and establish culture, which he actually felt would be impossible in our sort of natural state. In any case, there's a famous line from the Leviathan where Hobbes writes:

By ratiocination, I mean computation.

He explains this as essentially communicating two ideas. The first is that thinking is essentially composed of symbolic operations, just like we might use symbols in speech or in writing, only internal to the individual. He calls these units of thought parcels, and talks about how, in some sense, we can add parcels together to form thoughts.

The second idea here is that thinking is rational when it's treated as computation - that is, when it's done by some set of exact rules. It's interesting to note that Hobbes was actually a fan of Galileo and traveled to meet him. It's probably the case that Galileo's rigorous and sort of mechanical approach to physics had some influence on Hobbes, who believed that nature is, in some sense, inherently mathematical and physical.

Around the same period, in the mid-17th century, we find Blaise Pascal, a French scientist, inventor, and later writer and philosopher. Pascal probably could have been mentioned in the last episode, when we focused on some of the tools for calculation, for his 1642 machine, called the Pascaline.

He conceived of it and built it when he was just nineteen years old, to help his father, who worked as a tax collector. The device was kind of complicated and prone to mechanical error, but it could actually add and subtract numbers with up to eight digits. You can find pictures of them online now. He sold a handful of them over the years, but they didn't take off because they were kind of fragile and costly to manufacture.

I don't want to gloss over Pascal too quickly here. He is an important figure in classical French literature and in the development of probability theory and economics. But in terms of computer engineering, his Pascaline really does represent probably the most complete and functional forerunner to the calculator. You might also be aware that there is a programming language from the late sixties named after Pascal, designed by the Swiss computer scientist Niklaus Emil Wirth. He's someone we can hopefully talk about in a future podcast.

German mathematician and philosopher Gottfried Wilhelm von Leibniz, who also conceived of the ideas of the differential and integral calculus independently of Isaac Newton, became aware of Pascal's adding and subtracting machine.

Whenever I read about Leibniz, I just picture this guy who is - maybe just pacing through his study - just having read absolutely everything and having heard about all the most recent inventions and ideas, and just jumping on top of them, you know, taking them further than they currently existed. He just seems like he must have been so restless, given the magnitude of his contribution to so many areas of math and science and philosophy.

In any case, upon becoming aware of Pascal's adding and subtracting machine, he made a substantial improvement upon the design by implementing a mechanism to perform multiplication essentially as a sequence of those addition operations. This was his so-called Stepped Reckoner, and it was actually the first device built that could perform all four arithmetic operations. As I mentioned, Leibniz is sort of one of those immortal figures in mathematics and philosophy, having made significant contributions to many modern areas of intellectual study. In terms of computer science, Leibniz also refined the use of the binary number system, perhaps drawing upon work by the English philosopher Francis Bacon or the Scottish philosopher and mathematician John Napier, who had the amusing nickname Marvellous Merchiston.
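For transcript readers, the principle - though not Leibniz's actual stepped-drum mechanism - can be sketched as multiplication performed purely through repeated addition:

```python
def multiply_by_addition(a, b):
    """Multiply two non-negative integers using only addition,
    the way a calculating machine built around an adder could."""
    total = 0
    for _ in range(b):  # one addition per unit of the multiplier
        total += a
    return total

print(multiply_by_addition(6, 7))  # 42
```

The real machine was cleverer - it could shift by decimal places rather than adding one unit at a time - but reducing multiplication to repeated use of an existing addition mechanism is the core design idea.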

Leibniz was keen on comparing his own theistic and metaphysical views to those of other cultures, and was notably fond of Chinese culture. He is said to have compared an illustration of the yin and yang from a copy of the I Ching that he had to binary one and zero - as a mathematical concept, and in some sense a theistic concept.

Another monumental idea that Leibniz worked on was his calculus ratiocinator, which was sort of a framework for logical calculation.

Okay, so that wraps up this episode. I would like to say that there is so much that can be said about all of these thinkers that we've covered so far in the podcast. There is obviously no way that I can really do justice to the contributions of each of these individuals over the course of fifteen or twenty minutes in a podcast. At the very least, I hope that this format can serve to provide a high-level view of the myriad contributions that many individuals have made over time to form the foundation of what has become computer science. There is, of course, a great deal of scholarship about each of these individuals that is available, and I would definitely encourage you to read more about them if any of this interests you.

In the next episode, we'll explore the work of Charles Babbage and Ada Lovelace and take a look at really the first concept of a programmable, general-purpose computer. We'll also talk about the work of George Boole and Gottlob Frege as it relates to the formalization of logic and Boolean algebra. So I hope you stay tuned, and I hope that you are enjoying the podcast so far.

Thanks for listening. If you're looking for a way to support the podcast, you can do that over at bitwisepodcast.com - just click on the menu and click support. Thanks for listening.