Evolutionary Computation
There is a Portuguese version of this post.
Evolutionary computation (EC) has been widely used in recent years, and every year new applications appear for the techniques it has developed. Despite being a relatively new area to which few people are paying attention (at least in my view, and I will explain why), it probably has a promising and revolutionary future in relation to how we generate innovation and even learn from it, especially as regards Evolutionary Algorithms (EAs).
In March 2003, a paper was published in Scientific American by John R. Koza and other authors. Its title (and I recommend it to anyone who has no idea what I am talking about) was "Evolving Inventions". This was my first contact with EC, and it left me amazed.
Anyway, as I was saying, the paper discussed the use of Genetic Programming (GP) in the field of electronics; one example was the creation of low-pass filters, which allow low-frequency signals to pass while attenuating high-frequency signals. I do not know whether this was the first time GP was used for that application, but what I found most interesting in the paper was not the application to electronics. What surprised me was that the inventions created using GP were highly competitive with inventions created by humans. The authors noted that GP had already reproduced 15 patented inventions; some were new designs providing the same functionality as their predecessors, while others showed improvements.
Today I know that there are about 36 (or perhaps many more) GP-generated inventions that compete with human inventions. Of these 36, 15 infringe or duplicate the functionality of inventions patented in the twentieth century, 6 infringe or duplicate the functionality of inventions patented in the twenty-first century, and 2 of the 36 were themselves patented as new inventions.
But what does this really mean? Only that we have a few more inventions? Obviously not, and here lies the lack of attention given to EC: most people see only the result of the innovations, the final product, and fail to see what really deserves attention: the way they were created.
To better understand the importance of this, you must know what Genetic Programming needs in order to create these inventions.
What Genetic Programming needs is, in short: a structure representing a candidate solution to the problem (in the case of the low-pass filter, a structure representing the positions, connections, and components of the electronic circuit); a way of recombining two of these structures; a way of mutating each structure; and a function that, applied to a structure, tells us how good it is. For the low-pass filter, this quantitative measure could be how effectively the filter lets only low-frequency signals pass.
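These ingredients can be sketched in a few lines of code. The example below is a minimal toy, not Koza's circuit-evolving system: it evolves a bit string toward a target instead of a circuit, but it has exactly the four parts listed above: a solution structure, recombination, mutation, and a fitness function.

```python
import random

# Toy stand-in for a circuit encoding: a fixed-length bit string.
# (A real GP run would evolve circuit topologies and component values.)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(ind):
    """How good is this structure? Here: number of matches to TARGET."""
    return sum(a == b for a, b in zip(ind, TARGET))

def crossover(a, b):
    """Recombine two structures at a random cut point."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(ind, rate=0.1):
    """Flip each position with a small probability."""
    return [1 - g if random.random() < rate else g for g in ind]

def evolve(pop_size=30, generations=100):
    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == len(TARGET):
            break
        parents = pop[: pop_size // 2]          # keep the better half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

After a few dozen generations, `best` almost always matches `TARGET` exactly; the point is that nothing in the loop knows what a "good" bit string looks like beyond the fitness score, just as nothing in Koza's runs knew what a good filter looks like beyond its measured response.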
Now we can understand what is happening with EC: we have arrived at a point where we can create new inventions in an automated fashion, or, as John Koza, the author of the article, put it, we have created a machine for creating inventions.
Many of these inventions created using GP are difficult to understand, but we know that they work. The way they are created is as if we were telling the computer: "I need a circuit that blocks high-frequency signals and passes low-frequency ones", and the machine creates that invention, that circuit.
Now imagine the following situation: suppose we had not yet invented the low-pass filter and then employed GP to create one. The result would be a completely unknown low-pass filter, and we could learn from this invention created by a computer. GP does not create just a single invention; it creates something we can learn from, something that says much more than a mere invention.
Recently, an article was published in Science under the title "Distilling Free-Form Natural Laws from Experimental Data", showing the results of a new method for performing symbolic regression on raw data collected from sensors. The innovative technique uses partial derivatives of each pair of variables to compare candidate equations generated by an Evolutionary Algorithm similar to Genetic Programming (GP). Without any prior knowledge of physics, kinematics, or geometry, the algorithm discovered Hamiltonians, Lagrangians, and other conservation laws of geometry and momentum (for those who are interested, visit the university's site).
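The core comparison trick can be illustrated with a toy sketch. This is not the paper's algorithm (there is no evolution here, only a hand-picked set of three candidate invariants), but it shows the idea of scoring a candidate law by the slope it implies between a pair of variables versus the slope estimated numerically from raw data, here points sampled from the hypothetical conserved quantity x^2 + y^2 = 25.

```python
import math

# Hypothetical stand-in for sensor data: points along the level set of a
# conserved quantity, here the circle x^2 + y^2 = 25.
ts = [0.05 + 0.1 * i for i in range(40)]
points = [(5 * math.cos(t), 5 * math.sin(t)) for t in ts]

# Tiny fixed set of candidate invariants f(x, y), each with its partial
# derivatives (a real run would evolve these as expression trees).
candidates = {
    "x + y":     (lambda x, y: 1.0,   lambda x, y: 1.0),
    "x * y":     (lambda x, y: y,     lambda x, y: x),
    "x^2 + y^2": (lambda x, y: 2 * x, lambda x, y: 2 * y),
}

def score(dfdx, dfdy):
    """Compare the slope dy/dx implied by the candidate (implicit
    differentiation: dy/dx = -df/dx / df/dy) against the numerical
    slope estimated from consecutive data points. Lower is better."""
    err = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        numerical = (y1 - y0) / (x1 - x0)
        implied = -dfdx(x0, y0) / dfdy(x0, y0)
        # atan turns both slopes into angles, taming near-vertical tangents
        err += (math.atan(numerical) - math.atan(implied)) ** 2
    return err

best = min(candidates, key=lambda name: score(*candidates[name]))
```

Only `x^2 + y^2` implies slopes consistent with the data everywhere, so it wins; in the published method, candidates that survive this kind of test are then bred and mutated rather than read off a fixed list.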
This advance represents an important step for symbolic regression on raw empirical data collected through sensors. But what does this really mean?
Let us define what a mathematical equation is: a mathematical equation, or more precisely a mathematical function, expresses the relationship between the dependent and independent variables of a domain. In physics, for example, such a function can represent the mathematical knowledge of a natural law, such as Newton's laws. That is, a function may well represent a small part of our knowledge of the world around us.
When we perform symbolic regression (discovering the function that represents the relationship between variables), we do not find just another mathematical equation: we find a natural law, and from it we can accurately predict the behavior of a pendulum, for example, or how bodies move away from each other in relation to their distance.
As I said before, what we should pay attention to is how these laws are discovered. In the case of the research published in Science, the discovery was automated: no knowledge of physics or geometry was needed to find the function that dictates the relationship between the movements of some bodies; only the raw empirical data collected by sensors was required. What really matters here is that we are generating scientific knowledge capable of withstanding rigorous testing, in an automated manner, without prior knowledge of geometry or physics.
Thinking philosophically, if we start from the assumption that the science of nature, physics, contains principles that are synthetic a priori judgments, then these systems are clearly generating synthetic a priori judgments in an automated manner; that is, they can discover geometric relationships without prior knowledge of geometry. As a practical example, we could discover the Pythagorean theorem, or other relationships, using raw measurements of triangles without prior knowledge of the theorem. I currently have no time, but I will try to perform symbolic regression of the classic Pythagorean theorem using images of triangles.
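To make the Pythagoras idea concrete, here is a deliberately simplified sketch. A real symbolic-regression run would evolve expression trees from scratch; this toy only scores a small hand-picked hypothesis space against hypothetical raw measurements (randomly generated triangle side lengths standing in for measurements taken from images), but it shows how the right law is selected purely by its fit to the data.

```python
import math
import random

# Hypothetical raw "measurements": legs a, b and hypotenuse c of random
# right triangles (in a real experiment these would come from images).
random.seed(1)
data = [(a, b, math.hypot(a, b))
        for a, b in ((random.uniform(1, 10), random.uniform(1, 10))
                     for _ in range(50))]

# Hand-picked hypothesis space standing in for evolved expression trees.
candidates = {
    "a + b":           lambda a, b: a + b,
    "sqrt(a * b)":     lambda a, b: math.sqrt(a * b),
    "(a + b) / 2":     lambda a, b: (a + b) / 2,
    "sqrt(a^2 + b^2)": lambda a, b: math.sqrt(a * a + b * b),
}

def sum_sq_error(f):
    """Total squared error of a candidate formula for c over the data."""
    return sum((f(a, b) - c) ** 2 for a, b, c in data)

best = min(candidates, key=lambda name: sum_sq_error(candidates[name]))
print(best)  # prints: sqrt(a^2 + b^2)
```

Nothing in the selection step knows any geometry; the theorem wins simply because it is the only candidate whose error over the measurements is essentially zero.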
Finally, there are many experiments we can build using EAs, and it is only a matter of time before new and important scientific discoveries are made this way.
What else can we discover using Evolutionary Computation?