Background

Quantum Leap: A New Model of Computing



A computing model can be understood as a system that receives input(s) and produces output(s) according to a given set of instructions. Throughout the history of civilization, humans have come up with creative models to solve the problems they faced in their everyday activities. The abacus was one such simple model, built around 300 B.C. for essential calculations such as addition and subtraction. As time passed, visionary and talented people built far more capable computing machines that operate much faster and deliver highly accurate results.

[Figure: Computer miniaturization]

Moore’s law describes the miniaturization of computer hardware: according to it, the number of transistors on a chip roughly doubles every two years.
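Expressed as a formula, the doubling rule looks like the small sketch below. The 1971 Intel 4004 baseline is used purely as an illustrative starting point, not as precise industry data.

    def transistors(year, base_year=1971, base_count=2300):
        """Rough Moore's-law projection: the count doubles every two years.

        The defaults correspond to the Intel 4004 (1971, ~2,300 transistors);
        treat the numbers as illustrative, not as an exact industry dataset.
        """
        return base_count * 2 ** ((year - base_year) / 2)

    print(f"{transistors(2021):,.0f}")  # on the order of tens of billions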

Component size was not a pressing concern for computer scientists in the early days of modern computing. Over time, however, the elementary components of a machine have grown smaller and smaller. Eventually those components reach the scale of individual atoms, where classical physics can no longer reliably control the state of a computation. The most fundamental question, then, is how to find smaller elements that can perform computation even better than we do now.



As an alternative, quantum physics describes particles that are extraordinarily tiny and exhibit remarkable behavior. Computing based on quantum physics was first proposed in 1982 by Richard Feynman.

According to his explanation, the promise of quantum computing is that certain problems which take exponential time on a classical computer can be solved in polynomial time on a quantum one. Around the same time, Paul Benioff described a recognizable theoretical framework for a quantum computer. Tommaso Toffoli, David Deutsch and Peter Shor are some of the other pioneers of the early history of quantum computation.

Bit vs Qubit

It is said that the way we exploit physical phenomena in computation can decide how powerful that computation is. Quantum mechanics is the field that describes the behavior of matter at this tiny scale, and quantum-level particles and their properties can be used directly to carry out quantum computation.

In classical computing, the basic unit of information is a bit, representing either one or zero at any particular time. In quantum computing, quantum bits are different. A qubit can be one, zero, or a combination of one and zero at the same time. The classical bit has only one definite state, while the qubit can remain in a blend of states until we observe it. As soon as we observe a quantum system, it collapses into a classical state; for a qubit, that means it becomes a classical bit (Cbit) holding only one value, one or zero.
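A minimal sketch of this idea, using plain NumPy rather than any real quantum hardware or library: a qubit is a two-component vector of complex amplitudes, and observing it forces a single classical outcome.

    import numpy as np

    # A qubit state |psi> = a|0> + b|1>, stored as a vector of complex amplitudes.
    # Here it is an equal superposition (what a Hadamard gate produces from |0>).
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

    # Probabilities of seeing 0 or 1 are the squared magnitudes of the amplitudes.
    probs = np.abs(psi) ** 2          # [0.5, 0.5]

    # "Observing" the qubit: sample one outcome, then the state collapses to it.
    outcome = np.random.choice([0, 1], p=probs)
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1            # now an ordinary classical bit (Cbit)

    print("measured:", outcome, "collapsed state:", collapsed)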


Imagine doing many tasks at once, working, playing and dancing all at the same time; yet the moment you are observed, you can only be seen doing one of those activities.

Quantum Entanglement

Another important property of the quantum world is the way particles interact with one another. Suppose we take two quantum particles to opposite ends of the universe and perform an operation on one of them. Surprisingly, the effect shows up in the other particle with no time delay, as if they still worked together as a single pair. In other words, they are entangled.

Albert Einstein once described this as “spooky action at a distance”. He was reluctant to accept the idea, but experiments later proved him wrong. In essence, if two or more quantum systems are entangled, they keep behaving as one system even after they are separated. According to relativity, information cannot travel faster than the speed of light; yet with entanglement the correlated outcomes appear instantaneously, regardless of distance (strictly speaking, these correlations alone cannot be used to transmit a usable message faster than light).
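As a toy illustration (again only a NumPy simulation, not real entangled particles), a two-qubit Bell state always produces matching outcomes for both parties, no matter how far apart they are measured:

    import numpy as np

    # Bell state (|00> + |11>) / sqrt(2): amplitudes for |00>, |01>, |10>, |11>.
    bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

    probs = np.abs(bell) ** 2                    # [0.5, 0, 0, 0.5]
    for _ in range(5):
        joint = np.random.choice(4, p=probs)     # index of the joint outcome
        alice, bob = divmod(joint, 2)            # split into each party's bit
        print(alice, bob)                        # always equal: 0 0 or 1 1
        # The individual results are random, so the matching pattern by itself
        # cannot carry a message from one side to the other.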

Measurements

An algorithm embodies a solution to a particular problem. In computing we define inputs and outputs for an algorithm, and the output tells us the answer. A quantum system, however, deals with the multiple-state nature of qubits, so what should the answer be: a qubit or a Cbit? It has to be a classical, exact value. The solution is quantum measurement operators. A measurement collapses the superposition and produces a definite answer, with probabilities determined by the amplitudes.
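A minimal sketch of such a projective measurement in the computational basis, again with plain NumPy and illustrative amplitudes:

    import numpy as np

    # Projective measurement operators for the computational basis.
    M0 = np.array([[1, 0], [0, 0]], dtype=complex)   # projects onto |0>
    M1 = np.array([[0, 0], [0, 1]], dtype=complex)   # projects onto |1>

    psi = np.array([0.6, 0.8], dtype=complex)        # amplitudes: 36% zero, 64% one

    p0 = np.vdot(psi, M0 @ psi).real                 # Born rule: p(k) = <psi|Mk|psi>
    p1 = np.vdot(psi, M1 @ psi).real
    print(round(p0, 2), round(p1, 2))                # 0.36 0.64

    # After observing outcome 0, the state collapses to M0|psi> / sqrt(p0) = |0>.
    post = (M0 @ psi) / np.sqrt(p0)
    print(post)                                      # [1.+0.j 0.+0.j]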

Future with Obstacles

The key problem in implementing a quantum computer is maintaining the fragile quantum state. Such a state cannot survive at room temperature, so specially engineered systems have to be built to support stable quantum operations. This process of isolating quantum bits requires a huge investment, and the hardware takes up far more space than today's computers. This is why some critics claim there is no practical use for quantum devices. If we look at the history of computing, however, early machines were also huge and took hours to complete a single operation.

Some researchers argue that a quantum computer should not be considered a universal computing model capable of everything an ordinary laptop can do. What they suggest instead is that these devices can be used in special domains with a distinct structure, such as the optimization problems that are so prominent in machine learning. For example, they could potentially be used to model and help design new drugs for curing diseases such as cancer.

The D-Wave system is a good example of this type of device, built for optimization problems.
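Annealers of this kind are typically fed problems in QUBO (quadratic unconstrained binary optimization) form. As a purely classical, brute-force sketch of what such a formulation looks like (the matrix values below are made up for illustration):

    import itertools
    import numpy as np

    # Toy QUBO: minimise x^T Q x over binary vectors x. The Q entries here are
    # invented purely for illustration; real problems encode e.g. scheduling
    # constraints or machine-learning objectives.
    Q = np.array([[-1.0,  2.0,  0.0],
                  [ 0.0, -1.0,  2.0],
                  [ 0.0,  0.0, -1.0]])

    best = min(itertools.product([0, 1], repeat=3),
               key=lambda x: np.array(x) @ Q @ np.array(x))
    print(best)   # the assignment with the lowest energy, here (1, 0, 1)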

In addition to D-Wave, more than 17 groups around the world are working toward the same goal. Among them, well-known companies such as Google, Microsoft and IBM have already invested billions of dollars in R&D. As they say, quantum computing could be the next de facto computing technology.

Quantum Cryptography

Secret key distribution is a major problem in computer security. Many schemes have been proposed, but they still have weaknesses. The main advantage of quantum key distribution is its inherent ability to detect eavesdropping. A quantum system sits in a superposition of multiple states, and as soon as it is observed it collapses into one state. This attribute is used in cryptography to detect a man-in-the-middle attack: the quantum state survives intact only if nobody but the intended receiver observes the message.
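A heavily simplified, classical simulation of this BB84-style idea (the quantum behavior is only faked with random numbers; real QKD needs actual photons) shows why an eavesdropper necessarily introduces detectable errors:

    import random

    def measure(bit, prepared_basis, measure_basis):
        """If bases match, the bit is read faithfully; otherwise the result is random."""
        return bit if prepared_basis == measure_basis else random.randint(0, 1)

    n = 2000
    bits   = [random.randint(0, 1) for _ in range(n)]   # Alice's raw key bits
    a_base = [random.randint(0, 1) for _ in range(n)]   # Alice's encoding bases
    b_base = [random.randint(0, 1) for _ in range(n)]   # Bob's measurement bases

    errors = matched = 0
    for bit, ab, bb in zip(bits, a_base, b_base):
        # Eve intercepts: she must measure (collapsing the state) and resend.
        eve_basis = random.randint(0, 1)
        eve_bit = measure(bit, ab, eve_basis)
        bob_bit = measure(eve_bit, eve_basis, bb)
        if ab == bb:                       # only positions where bases agree are kept
            matched += 1
            errors += (bob_bit != bit)

    # Without Eve this error rate would be 0%; with her it is about 25%.
    print(f"error rate with an eavesdropper: {errors / matched:.0%}")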

Yes, a quantum system can be much faster at finding answers to certain problems. Cryptographic algorithms such as RSA secure communication channels based on the size of a secret key, and cracking those keys is practically impossible with today's computer technology. With quantum speedup, however, it has been shown that such keys can be broken in polynomial time using quantum algorithms such as Shor's. What we consider secure now will therefore not be secure in the quantum age of computing.
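A toy illustration of why a factoring speedup matters, using absurdly small and insecure numbers chosen purely for demonstration: once the RSA modulus is factored, the private key follows immediately.

    # Toy RSA with tiny primes (real keys use 2048-bit moduli).
    p, q = 61, 53
    n, e = p * q, 17                       # public key: (n, e)
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)                    # private exponent (Python 3.8+)

    msg = 42
    cipher = pow(msg, e, n)

    # An attacker who can factor n (which Shor's algorithm does in polynomial
    # time on a quantum computer) recovers d and decrypts immediately.
    p_found = next(f for f in range(2, n) if n % f == 0)
    q_found = n // p_found
    d_cracked = pow(e, -1, (p_found - 1) * (q_found - 1))
    print(pow(cipher, d_cracked, n))       # 42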

Conclusion
The field of quantum computing is one of the most interesting and rapidly growing areas among universities, leading IT companies and research groups. They expect to invest more in this field over the coming decade and to overcome the current obstacles. Even though quantum computers are still in their infancy, they may be in your hands in the near future.
