Martin Bartels

In abstract terms, we can describe a computer as an object designed by humans and functioning outside the human body, which performs calculations based on deterministic processes.
The origins of computers can be traced back to the ambition of deeply thoughtful individuals who strove to make the universe more understandable. However, these early inventions were not intended for everyday practical purposes, which is why early computers did not become widespread in society.
In spring 1900, a chance discovery by sponge divers on the seabed of the Aegean Sea revealed that the ancient Greeks had already developed computers before the start of the Common Era. It then took another century to fully comprehend the preserved parts. Finally, with the help of DeepMind's Ithaca programme, the engraved instructions for use have recently become comprehensible, enabling the device to be fully reconstructed. The resurrected version of this mechanical computer, now known as the 'Antikythera mechanism', functions reliably. It is capable of performing precise astronomical calculations and determining the dates of the Olympic Games.
Much later, the mechanical astrolabe, refined by Arab scientists, was used to measure the height of celestial bodies above the horizon and thus became a significant technology for determining the positions of ships at sea.
Leonardo da Vinci designed a calculating machine, but it is unclear whether his sketches of this machine ever led to the construction of an actual physical device.
A century or so after da Vinci, the calculating machines developed by Schickard (1623) and Leibniz (1694) probably reached the limits of what could be achieved by purely mechanical devices.
The binary number system, which is essential for modern computers, dates back to the 17th century. It was not until the late 1930s that this system was first applied in Konrad Zuse's motor-driven, freely programmable mechanical computer, the Z1.
The subsequent groundbreaking successes of electromechanical and electronic computers are familiar to us all. Developments in computer technology have accelerated steadily, with ever-increasing computing power and, thanks to rapid innovation and surging demand, falling prices. Within this field, each technological advance is marked by a new product series, i.e. through carefully planned partial or complete replacement of hardware and software. Prior to release, new technologies are tested rigorously so that problems can be identified and fixed.
New systems that users perceive as inferior or less effective than those of competitors fall victim to what, in evolutionary terms, we would call negative selection. In other words, no one buys them, and the product drifts out of the market. So we are dealing with a rapid and reasonably well-functioning evolutionary process. The overpowering driving force behind this is the commercial value of new technologies and fierce competition between rival companies.
The field of 'photonic computing' is evolving quickly and is set to increase the performance of systems by relying on extremely fast light signals, which consume very little electrical energy.
Intensive scientific research, labelled 'neuromorphic computing', is underway, exploring the creation of powerful computer systems that emulate biological nervous systems. Attempts are already being made to simulate the functioning of the human brain, though these are likely limited to its rational part, i.e. the prefrontal cortex.
Once quantum computers are ready for the market, we can expect to see an additional increase in the performance of AI models.
We cannot know for sure which developments will reach the stage of broad practical application, and when this will happen. Nor can we anticipate which additional approaches will broaden the spectrum. However, there is no reason to expect progress to slow down.
The timeline of the evolution of computers is extremely short compared to that of humans, who took millions of years to reach the stage of Homo sapiens around 300,000 years ago.
We have always tended to regard the human species as the crown of creation. In the Age of Enlightenment, there were debates about whether nature, including the human body, was an expression of divine will and, as such, should be considered ideal as it was, with no room for improvement. The idea of the 'divine watchmaker', whose intricate natural designs are as precise as clockwork, still flares up today, even in academic circles. However, the assertion that creation is perfect is nothing but a postulate. This overconfident thinking is a dead end.
Clearly, some results of our evolution are suboptimal. Evolution has not eradicated many anatomical and physiological flaws. For instance, the human spine is poorly suited to walking upright, and the pelvis is too narrow for childbirth. The list of our physical shortcomings is long. Evolution did not give us bodies entirely suitable for the lives we want to live.
Instead of replacing parts that no longer meet humans' changing requirements, nature has preserved and adapted them to some degree. We have modern medical science to thank for compensating for many disadvantages of our suboptimal bodies.
If engineers were tasked with designing a human body that optimally met the demands of modern civilisation, they would arrive at results that would save us from a lot of trouble and pain, but that would not fit our ideals of beauty or our habits, to put it mildly.
Our thought processing system contains both old and new components that function in different ways, and which sometimes work against each other.
The prefrontal cortex is the 'modern' part of the human brain, responsible for logical thinking and weighing up the short- and long-term consequences of actions.
In contrast, our very old reptilian brain harbours survival instincts, such as defending one's territory and asserting dominance.
Our limbic system steers “cognition, including spatial memory, learning, motivation, emotional processing, and social processing.”
Conflicts flaring up between age-old systems and new systems in our brains provide material for crime novels and dramatic news reports. They help to explain much of the rudeness that we either experience from or inflict on others, individually and collectively. Traditional cultural systems that have grown out of the old components of the brain may act to mitigate conflict but can also feed it.
Such anomalies may be exacerbated by the outstanding human ability to retrospectively justify irrational, and even cruel, actions in such a way that they appear to others as a logical or even ethical necessity.
Although the evolution of organic beings allows for useful mutations that promote further development, selection primarily occurs in response to errors that threaten survival and reproduction. While human evolution is ongoing, the changes it brings are, from our perspective, conservative, owing to a relative lack of selective pressure. Thus, compared to computers, which exist in a competitive marketplace where the threat of extinction is ever present, our own evolutionary leaps occur on glacial timescales. There are no signs of exponential growth in human thinking capacity. Moore's Law does not apply here.
At this point, it might be tempting to delve into the supposed solutions of genetic engineering and brain implants. However, this would take us far beyond the scope of this article.
We should accept that biological evolution cannot keep pace with the giant leaps that computers are making. But should we keep the digital tiger in a cage?
The process of continuing development of computers is still very much in human hands. We can design computing processes so that humans make all critical decisions, without which nothing can proceed. This should even be possible with self-modifying or self-evolving processes.
The task is to ensure that, in the long term, we maintain a firm position of superiority over thinking machines, even though our brains' processes are slower and sometimes short-circuit.
Over 50 years ago, the epic film 2001: A Space Odyssey, which spans over four million years, captured the essence of our theme, depicting astronaut Bowman's battle against the seemingly perfect computer HAL 9000. Even today, the significance of Bowman's enigmatic victory remains unclear. What was once a staple of science fiction, humans battling seemingly sentient computers, is now a reality.