The abacus is thought to be of Sumerian origin and to date back over 4,000 years. Through trade the concept spread and varied regionally; abacus traditions are today part of UNESCO's intangible cultural heritage.
The famous physicist Richard Feynman, known for his ability to perform written calculations extraordinarily fast, was once challenged to a calculating contest by a Japanese abacus expert. While the abacus proved to be the faster method for addition, Feynman's written calculations came out ahead on more complex problems.
The mechanical calculating machines of the 17th century were, by the standards of the time, extremely demanding in terms of precision engineering, which in some cases prevented reliable operation and wider distribution.
The Pascaline was constructed by Blaise Pascal in 1642 and was long considered the first mechanical adding machine, until references to Schickard's calculating machine surfaced in the 20th century. That machine, built in 1623, could already add and subtract. Its design was lost for a long time, but drawings of the machine were found in the estate of Johannes Kepler, and a working replica was built in 1960.
The machine designed by Gottfried Wilhelm Leibniz, based on the stepped-drum principle, could perform addition, subtraction, and multiplication. Parts were demonstrated to the Royal Society in London in 1673, but the intricate precision mechanics probably prevented a complete and error-free working model during Leibniz's lifetime — it was not until 1990 that such an error-free working replica was built. Leibniz is also of great importance for modern information processing, for example through his investigations of binary numbers and his combination of arithmetic and logical problems by means of binary numbers.
Babbage: Analytical Engine
The original machine would have been a true masterpiece of engineering: over 50,000 parts, powered by a steam engine, and the size of a small factory floor. Programming and data input via punched cards were inspired by the automated, steam-driven Jacquard looms emerging at the time. Data output also took place via punched cards. The programming language resembled today's assembly languages and would have made the Analytical Engine the first Turing-complete machine, since it allowed loops and conditional execution. Due to the enormous complexity, only individual components were built, and the machine later fell into oblivion.
Ada Lovelace translated a description of the machine by Luigi Federico Menabrea from French into English, which, with Babbage's encouragement, grew into a much more detailed treatise. She recognized the machine's possibilities beyond mere calculation and considered processing letters or composing music. She wrote a description of how to calculate Bernoulli numbers and is considered the world's first programmer, although Babbage himself had also described how to control his machine by giving instructions. Women also often played an essential role in the design of later programming languages.
Boole laid the foundations of modern mathematical logic by consistently formalizing classical logic and propositional logic. Nowadays, Boolean logic encompasses not only Boole's original concept but also the additions developed by Ernst Schröder, Giuseppe Peano and Ivan Ivanovich Zhegalkin, among others.
It allows logical statements to be connected elegantly: they are represented by familiar mathematical symbols and can be manipulated much like ordinary arithmetic.
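To illustrate this kind of calculation, here is a minimal Python sketch that verifies one of the basic Boolean identities, De Morgan's law, by exhaustively checking every combination of truth values:

```python
# Verify De Morgan's law, not(a and b) == (not a) or (not b),
# by checking all possible truth-value combinations.
from itertools import product

for a, b in product([False, True], repeat=2):
    lhs = not (a and b)
    rhs = (not a) or (not b)
    assert lhs == rhs, f"counterexample at a={a}, b={b}"

print("De Morgan's law holds for all truth values.")
```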
Around 1900, the foundations of quantum mechanics were laid — among others by Max Planck, Niels Bohr, Louis de Broglie, Albert Einstein, Werner Heisenberg, Paul Dirac and Erwin Schrödinger — and these developments were honored with several Nobel Prizes. Central to quantum mechanics is its probabilistic view of the world: a particle, for example, is not deterministically in one place but only there with a certain probability, and with some probability it can be found elsewhere. Only a measurement determines the location, but it also destroys the underlying quantum state.
An imaginary memory tape contains a series of characters, all of which must be taken from the so-called input alphabet. A machine converts these characters according to predetermined rules. These rules change in turn — the machine assumes other 'states'. At each step, it decides not only whether a character is converted, but also whether a new state is assumed. The states and their transitions thereby encode the computational operations — they can be applied to any input. Turing also developed specialized computing machines (the so-called 'bombes') to decode German military ciphers during World War II.
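To make the idea concrete, here is a minimal Python sketch of such a machine (the transition table and tape encoding are illustrative choices, not Turing's original notation); this toy machine inverts every bit of its input and halts at the first blank:

```python
# A minimal Turing machine: the transition table maps (state, symbol)
# to (new state, symbol to write, head movement).
rules = {
    ("invert", "0"): ("invert", "1", +1),  # flip 0 -> 1, move right
    ("invert", "1"): ("invert", "0", +1),  # flip 1 -> 0, move right
    ("invert", "_"): ("halt",   "_",  0),  # blank symbol: stop
}

def run(tape_string):
    tape = dict(enumerate(tape_string))    # sparse tape; "_" is the blank
    state, head = "invert", 0
    while state != "halt":
        symbol = tape.get(head, "_")
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip("_")

print(run("10110"))  # -> "01001"
```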
Von Neumann Architecture
The von Neumann architecture consists of different parts which, taken together, implement all functions of the Turing machine. A memory unit, for example, contains the characters that sit on the Turing machine's memory tape. It also stores the programs, i.e. the instruction sets corresponding to the states of the Turing machine, and the rules according to which the states are changed. A control unit transmits these instructions to an arithmetic unit that actually physically executes the operations. A bus system ensures communication among these elements and with the input and output devices.
Although the von Neumann architecture is merely an abstraction, it is much more application-oriented, and many of its elements can be identified with the components of a modern PC: the memory unit corresponds to RAM and the hard disk, for example, and the arithmetic unit to the processor.
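The following minimal Python sketch mimics the von Neumann cycle: a single memory holds both instructions and data, and a control loop fetches, decodes and executes them (the tiny instruction set is a made-up example, not any real machine's):

```python
# One shared memory for program and data, plus a fetch-decode-execute loop.
memory = {
    0: ("LOAD", 10),    # accumulator <- memory[10]
    1: ("ADD", 11),     # accumulator += memory[11]
    2: ("STORE", 12),   # memory[12] <- accumulator
    3: ("HALT",),
    10: 2, 11: 3, 12: 0,   # data lives in the same memory as the program
}

pc, acc = 0, 0                       # program counter and accumulator
while True:
    op, *args = memory[pc]           # fetch and decode
    pc += 1
    if op == "LOAD":    acc = memory[args[0]]
    elif op == "ADD":   acc += memory[args[0]]
    elif op == "STORE": memory[args[0]] = acc
    elif op == "HALT":  break

print(memory[12])  # -> 5
```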
Zuse Z4, Eniac, Univac
The Zuse Z4, designed by Konrad Zuse in the 1940s, was the first calculating machine of this generation, although it was not entirely Turing-complete at the time: it could not put into practice all the functions that a Turing machine postulates. This was achieved by the ENIAC, a device based on electron tubes, diodes and relays, which was developed for the American army. Its designers went into business for themselves some time later and launched the first commercial computer, the UNIVAC.
A transistor can control the currents and voltages at some of its terminals by means of voltages applied to other terminals. This makes it possible to find physical equivalents to the 1 and 0 states of modern digital computers and to realize calculations through the interplay of transistors.
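The following Python sketch illustrates this interplay in the abstract: assuming a single transistor-based switching function such as NAND, all other logic operations, and with them arithmetic, can be composed from it (the function names are illustrative):

```python
# Build logic, and then arithmetic, out of a single switching function.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

# Half adder: the first step from switches to arithmetic.
for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "-> carry:", and_(a, b), "sum:", xor(a, b))
```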
Electrically charged particles, such as ions, are subject to forces in electric and magnetic fields. It seems plausible not only to set them in motion, but also to stop them — to capture them, so to speak. However, the laws of electrodynamics do not make it quite so simple, and it was not until 1953 that Wolfgang Paul succeeded in capturing ions by using high-frequency alternating electric fields — radio waves. Originally, Paul only wanted to determine the masses of different ions. However, his invention had resounding success in the field of atomic physics, where it was of massive significance in the run-up to the development of laser cooling. From the 1970s onwards, it became increasingly possible to trap ions in these traps and thus to explore the laws of quantum optics, in particular by Hans Georg Dehmelt, who received the Nobel Prize together with Paul for this innovation.
The very first inputs into computers were made with punch cards. Impractical, not user-friendly, and heavily dependent on the particular computer model, these systems were replaced starting in 1957 by programming languages. These are formalized languages that are easier for humans to parse and understand, and which are used to write programs that are then translated into machine language by a compiler and executed. IBM introduced Fortran for its computer systems in 1957. At the same time, Grace Hopper developed FLOW-MATIC, the precursor language to COBOL, for Remington Rand's UNIVAC computer.
Then, as now, the way to improve computers was well known: more circuits, more components. Conventional designs required vast quantities of vacuum tubes, relays, transistors and other electronic components. Jack Kilby realized, however, that with the invention of the transistor it had become possible to make all the necessary components from semiconductors. He succeeded in producing a fairly complex circuit monolithically, with very small components, on a germanium chip. Although silicon quickly proved to be the superior material, Kilby nevertheless received the Nobel Prize in Physics in 2000.
By the end of the 1960s, computers were already widespread as large-scale devices. Several companies then began to develop operating systems to allow multiple users to use the computers simultaneously and to run programs concurrently. To prevent the latter from getting in each other's way, it was necessary to organize resource usage. UNIX was developed at the famous Bell Labs for this purpose. Microsoft's DOS operating system became the cornerstone of that company's success a few years later. Today, Windows shares the market with macOS and Android (on cell phones) — but the UNIX-like Linux is also still widely used.
Based on the exploitation of discrete energy levels and transition wavelengths in atoms, together with the Doppler effect, two proposals were published independently in 1975 — by Theodor Hänsch and Arthur L. Schawlow, and by David J. Wineland and Hans G. Dehmelt — on how to cool free or trapped atoms with laser light. All four authors later received a Nobel Prize, though none for their contribution to laser cooling: Schawlow in 1981, Dehmelt in 1989, Hänsch in 2005, Wineland in 2012. Laser cooling enabled groundbreaking developments such as Bose-Einstein condensation and the world's most accurate atomic clocks.
Some of the Nobel laureates mentioned here helped shape not only the research that led to this company's founding, but also the founders of eleQtron themselves: Prof. Christof Wunderlich received his PhD in the group of Theodor W. Hänsch and did postdoctoral research in the group of Serge Haroche before investigating new concepts for quantum computers with ions together with Peter E. Toschek and Werner Neuhauser. Dr. Michael Johanning did research as a postdoc in the group of William D. Phillips.
Apple II, IBM PC
The Apple II, along with competing models from Commodore and Tandy, was among the first commercially successful personal computers. Its relatively wide distribution enabled commercial software providers to find a mass market for the first time. The large number of available programs in turn increased its appeal, and four years later led industry giant IBM to enter the PC market as well.
Observation of the first single ion
The combination of ion traps and laser cooling allowed the creation of Coulomb crystals, in which ions arrange themselves in regular structures; something similar had been observed before with charged dust particles. Thus, in 1980, Werner Neuhauser, Martin Hohenstatt and Peter Toschek were able to take a photo of a single ion, something that founders of quantum mechanics such as Heisenberg and Schrödinger had dismissed as infeasible only 20 years earlier.
If easily controllable ions could be made to interact with each other in the same way as, for example, electrons in a solid, that solid could effectively be recreated. This concept, then called the quantum simulator, is considered one of the origins of the idea of the quantum computer.
The Concept of Quantum Computing
Almost all real physical systems consist of a multitude of smaller particles whose interactions determine the measurable properties of the system. To understand such a system, one could calculate these properties — but doing so without approximations is nearly impossible. For example, a system of just 30 particles, each of which can individually assume two states (a qubit), already has about a billion possible states, so describing it exactly is practically out of reach. Feynman's idea was to adjust a well-controllable physical system so that it behaves like the system of interest — and to simulate it in this way. The computer here is the adjustable physical system, which itself relies on quantum effects — a quantum simulator.
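The exponential growth behind this statement is easy to check; a quick Python sketch:

```python
# The number of basis states of n two-level systems (qubits) grows as 2**n.
for n in (10, 20, 30, 40):
    print(f"{n} qubits: {2**n:,} states")
# 30 qubits: 1,073,741,824 states -- roughly the 'billion' quoted above.
```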
Like operating systems, modern network technologies owe their existence to the need of various computer users to share resources and, if possible, to control computers remotely. ARPANET was introduced in 1969 as a network connecting various American universities over the telephone network. The WWW also has academic origins: it was developed at the CERN physics research center by Tim Berners-Lee, who proposed it in 1989, to give universities around the world access to the high-energy physics data measured and stored there.
Since the proposal of a CNOT quantum gate with stored ions by Cirac and Zoller in 1995, several groups worldwide have been working on implementing this concept. Initially, focused, highly stable laser sources were used to control individual ions and thus qubits. The ions' mutual repulsion allows the creation of entangled states: here, the state of each ion is random when measured on its own, yet both ions are always found in the same state, for example.
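The following minimal Python sketch (a textbook state-vector calculation using numpy, not the ion-trap implementation itself) shows how a CNOT gate turns a superposition into such an entangled Bell state:

```python
# Hadamard on the first qubit, then CNOT: the result is a Bell state in which
# each qubit is individually random but both always agree when measured.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

psi = np.array([1.0, 0.0, 0.0, 0.0])              # both qubits start in |0>
psi = CNOT @ np.kron(H, I) @ psi
print(psi)  # [0.707 0 0 0.707]: the state (|00> + |11>) / sqrt(2)
```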
The stability of laser sources at this level is an enormous technical challenge, and the realization of CNOT gates with ions by Blatt’s and Wineland’s groups in 2003 is testimony to the high level of control and stability that has been achieved in these groups. The control of small quantum systems was recognized with the Nobel Prize in 2012.
Other platforms have also been proposed and used to demonstrate quantum algorithms. One outstanding experiment was the efficient factorization of the number 15 using a molecule whose nuclear spins served as quantum registers, controlled with radio-frequency pulses (nuclear magnetic resonance, NMR). Despite these early successes, this platform was known not to be scalable: ensemble averages are measured, so the signal-to-noise ratio scales poorly as the register size increases.
MAGIC (MAgnetic Gradient Induced Coupling) allows control of qubits with static magnetic fields and scalable radio frequency fields as widely used in communications. The concept was invented by founder Christof Wunderlich and experimentally researched and further developed in his Siegen research group. The MAGIC concept combines the advantages of individually controllable ions with the scalability and control of microwave pulses from NMR.
Given the early successes of NMR-based systems and the excellent quality of control and state measurement for single ions, the question arises whether the NMR approach could not also be applied to ions. Indeed, the quantum states of ions can be controlled coherently, i.e. with full preservation of quantum information, by radio-frequency pulses. However, two physical facts stand in the way of this approach:
High-frequency fields cannot be focused on single, closely spaced ions because of their long wavelength.
The momentum of high-frequency photons is too small to excite the oscillation of the ions that is used to create entangled states.
In Prof. Christof Wunderlich's concept, these two problems are solved by adding inhomogeneous magnetic fields. His group has been working on the demonstration and application of robust quantum gates with single ions since the very beginning.
Nobel Prize for Serge Haroche and David Wineland
Both were honored for their groundbreaking research on the analysis and control of individual quantum systems, which they pursued consistently over decades. Wineland specialized primarily in the control of ions with laser light, while Haroche investigated the interaction between microwave photons and uncharged atoms.
What they both had in common was that they succeeded in observing fundamental quantum effects and controlling them almost at will. This includes, for example, an implementation of Schrödinger’s cat — Erwin Schrödinger’s famous thought experiment, which leads to a superposition (i.e. simultaneous existence) of two classically distinguishable states.
eleQtron is founded
eleQtron GmbH was founded on May 12, 2020. The founders on the scientific and technical side are Prof. Dr. Christof Wunderlich and PD Dr. Michael Johanning, who have for years been intensively and successfully pursuing the quantum information approach with ions, magnetic fields and radio-frequency pulses, both conceptually and experimentally. Business expertise is contributed by Hill GmbH, represented by Prof. Dr. Martin Hill and Dr. Rainer Baumgart, and by the Siegerlandfonds, represented by Dr. Susanne Kolb.
First industrial applications
We are pursuing two major goals in parallel: on the one hand, developing a scalable and freely programmable quantum computer; on the other, building a soon-to-be-available quantum computer specialized for specific solutions. By 2025, we will lay the foundations for the first goal, and beyond that we will have completed a quantum computer suited for optimization problems.
Scalable Quantum Computer
MAGIC quantum computers will be available for a large number of industrial and academic problems and will revolutionize both business and science.
Capabilities and Applications
Quantum Computers can solve mathematical problems which conventional computers cannot tackle.
A qubit is the smallest computational unit of a Quantum Computer, just as a bit is the smallest unit of a conventional computer. A qubit, however, can take the values 0 and 1 — at the same time.
The bits of a conventional computer take the values 1 or 0 — this is the basis for its calculations. A Quantum Computer's qubits can take combinations ('superpositions') of 1 and 0. This means that a parallel calculation on different inputs is possible; several solutions can be tried out at the same time, so to speak.
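A small Python sketch may make this concrete: a qubit state stores amplitudes for 0 and 1 simultaneously, and only measurement (here simulated by sampling) forces a definite value:

```python
# An equal superposition of 0 and 1, measured according to the Born rule.
import numpy as np

amplitudes = np.array([1, 1]) / np.sqrt(2)     # both values held at once
probabilities = np.abs(amplitudes) ** 2        # Born rule: |amplitude|^2
samples = np.random.choice([0, 1], size=10, p=probabilities)
print(probabilities, samples)  # [0.5 0.5] and a random mix of 0s and 1s
```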
There are working Quantum Computers with some dozens of qubits, and they can solve special test problems that would take classical computers much longer. Before a freely programmable Quantum Computer becomes relevant for industrial applications, however, many problems will have to be solved. In particular, the quality of the gate operations — the simplest steps of a calculation — has to be increased drastically.
No! Special problems of commercial and scientific relevance can already be attacked by so-called NISQs (Noisy Intermediate-Scale Quantum computers), which are computers with limited qubit number and gate fidelity. The development of NISQs with relevant calculation power is ongoing, and it also drives the technology improvements that will be necessary for universal quantum computers.
Problems in basic research, like the simulation of large quantum systems, but also problems from finance, chemistry and logistics.