a short history of computation

this one's for our parents


01.

the realization of computation

All computation, whether digital, analog, or quantum, rests on an analogy: a systematic relationship between the states of the computing device and those of the primary system it represents.

 

Any computing paradigm, whether analog, digital, or quantum, is at heart a mathematical study. This systematic relationship between the physical computing system and the primary one has led infinityQ to develop a new computational approach: quantum analog computing.

02.

analog: representing physical systems numerically

Analog computers have a long history predating the digital age and have been applied to a wide variety of fields. Analog computation refers to an analogy, or systematic relationship, between the physical processes in the computing device and those in the system it models. An analog computer is therefore an analogue of the particular system it is set up to describe: the physical quantities of the device obey the same mathematical laws as the corresponding quantities of the system under study.

Rather than operating through the manipulation of numbers as digital computers do, analog computers produce numbers as the result of measurements of physical parameters. Electronic components (physical devices) are used to sum, multiply, and integrate physical quantities, as in the sketch below.
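As a rough illustration of this idea, the sketch below (plain Python, not infinityQ's hardware) steps through what a patched analog circuit does for a damped harmonic oscillator: two integrators and a summing junction, with the coefficient values chosen arbitrarily for the example.

```python
import numpy as np

# Illustrative sketch only: a damped harmonic oscillator
#   x'' = -2*zeta*omega*x' - omega**2 * x
# "patched" the way an electronic analog computer would be: a summing junction
# and two integrators, stepped here in software with assumed parameter values.

omega, zeta = 2.0 * np.pi, 0.1   # natural frequency and damping (assumed)
dt, steps = 1e-3, 5000

x, v = 1.0, 0.0                  # initial conditions set on the integrators
trace = []
for _ in range(steps):
    a = -2.0 * zeta * omega * v - omega**2 * x   # summer: weighted feedback signals
    v += a * dt                                  # first integrator: acceleration -> velocity
    x += v * dt                                  # second integrator: velocity -> position
    trace.append(x)

print(f"x(t_final) = {trace[-1]:+.4f}")
```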


analog system advantages

flexibility in connecting components in a variety of ways (depending on the given physical system)

speed

inherent parallelism

small size

In principle, any physical process that can be described mathematically can be used as the underlying structure for analog computation.


03.

the main categories of quantum computation

In a well-known 1982 lecture, Richard Feynman proposed a quantum machine capable of simulating quantum physics. He posited that, since nature is not classical, simulating natural phenomena would require a device that operates on quantum mechanical principles. Such devices would exploit quantum mechanical properties like superposition and entanglement. The promise of quantum computing is more efficient computation, especially for certain types of computationally intensive problems.

GATE PARADIGM

The most widely known category is the gate paradigm, which is analogous to binary logic gates in classical computers. Quantum information is processed by means of quantum gates, composed into circuits that achieve practical functionality with the promise of significant advantages over classical computing.
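As an illustration of the idea, the minimal NumPy sketch below (not tied to any particular quantum SDK or hardware) applies two standard gates, a Hadamard and a CNOT, to a two-qubit state vector to prepare a Bell state.

```python
import numpy as np

# Minimal sketch of the gate paradigm: two standard gates acting on a
# 2-qubit state vector, represented directly as matrices and a vector.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # control = first qubit, target = second

psi = np.array([1, 0, 0, 0], dtype=complex)    # |00>
psi = np.kron(H, I) @ psi                      # superposition on the first qubit
psi = CNOT @ psi                               # entangle the two qubits

print(np.round(psi, 3))   # ~[0.707, 0, 0, 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```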

ADIABATIC

The computation starts from an initial Hamiltonian whose ground state is easy to prepare. The Hamiltonian is then gradually varied into a final Hamiltonian whose ground state encodes the solution to the computational problem. Adiabatic computation has been shown to be theoretically as powerful as the circuit-based approach, but it requires additional physical qubits.
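The toy sketch below (an illustrative two-level system with assumed initial and final Hamiltonians, not any specific device) interpolates H(s) = (1 - s) * H_initial + s * H_final and tracks the ground-state energy and the spectral gap along the sweep.

```python
import numpy as np

# Sketch of the adiabatic idea on a toy 2-level system (illustrative only).

X = np.array([[0, 1], [1, 0]])        # Pauli-X
Z = np.array([[1, 0], [0, -1]])       # Pauli-Z

H_initial = -X                        # easy-to-prepare ground state: (|0> + |1>)/sqrt(2)
H_final = -Z                          # ground state |0> encodes the "answer" in this toy

for s in np.linspace(0.0, 1.0, 5):
    H = (1 - s) * H_initial + s * H_final
    energies, states = np.linalg.eigh(H)
    gap = energies[1] - energies[0]   # spectral gap controls how slowly one must sweep
    print(f"s = {s:.2f}  ground energy = {energies[0]:+.3f}  gap = {gap:.3f}")
```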

CONTINUOUS VARIABLE (CV)

Quantum information is encoded in infinite-dimensional bosonic modes (for example, modes of light). Computationally it is similar to the circuit model. A qubit can be realized, for instance, by the polarization state of a photon. A limitation of this architecture is the need for photon-photon interactions for multi-qubit control, which requires strong optical nonlinearities.
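To make the "infinite-dimensional bosonic mode" concrete, the sketch below (plain NumPy/SciPy, with an arbitrarily chosen truncation of the Fock space and an arbitrary amplitude alpha) builds the annihilation operator for a single mode and displaces the vacuum into an approximate coherent state.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative sketch: a single bosonic mode in a truncated Fock basis.
# The truncation dimension N and the amplitude alpha are assumed values.

N = 20                                         # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, N)), k=1)     # annihilation operator: a|n> = sqrt(n)|n-1>

alpha = 1.0 + 0.5j
D = expm(alpha * a.conj().T - np.conj(alpha) * a)   # displacement operator D(alpha)

vacuum = np.zeros(N, dtype=complex)
vacuum[0] = 1.0
coherent = D @ vacuum                          # approximate coherent state |alpha>

mean_photons = np.vdot(coherent, a.conj().T @ a @ coherent).real
print(f"<n> = {mean_photons:.3f}  (expected ~|alpha|^2 = {abs(alpha)**2:.3f})")
```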


04.

the current limitations of classical computing

Today, the main computational device is the digital computer, which has become omnipresent in our daily routines. For decades, increased computational power and energy efficiency have been obtained by shrinking transistors and packing more of them onto microchips. Yet by the mid-2000s, transistor miniaturization had led to such narrow gaps between transistors that current leakage occurred, eroding energy-efficiency gains and increasing the risk of overheating, since there is a natural upper bound on how rapidly waste heat can be removed.

 

A workaround to mitigate these limitations has been the development of many-core architectures and vectorization, although their gains are bounded by Amdahl's law. Even specialized architectures, such as GPUs, are limited in clock speed by parasitic capacitances and other energy barriers. The consequent slowing of Moore's law has prompted the need to develop new computing technology.
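For a sense of what Amdahl's law implies, the short sketch below evaluates the standard speedup bound 1 / ((1 - p) + p / n) for an assumed parallel fraction p = 0.95; the values are illustrative, not measurements.

```python
# Amdahl's law: if a fraction p of a workload parallelizes perfectly over n cores,
# the overall speedup is bounded by 1 / ((1 - p) + p / n).

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

p = 0.95                            # assumed parallel fraction
for n in (4, 16, 64, 1024):
    print(f"n = {n:>4}  speedup = {amdahl_speedup(p, n):.2f}")
print(f"asymptotic limit ~ {1.0 / (1.0 - p):.1f}x, no matter how many cores")
```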