
Quantum Computing: The Future of Computing

The main idea of this research paper is to study some applications of quantum computation and to examine the interplay between quantum theory and AI. For readers who are not familiar with quantum computing, a brief introduction is given, along with a short history and a comparison between classical and quantum computing.

Introduction
Quantum theory is without doubt one of the greatest scientific achievements of the 20th century.[5] It introduced a new line of scientific thought, predicted situations that had previously seemed entirely inconceivable, and has influenced many domains of modern technology. The laws of physics can be expressed in many different ways.

Similarly, information can be expressed in many different physical forms. The fact that information can be conveyed in different ways without losing its essential identity leads to the possibility of the automatic manipulation of data.

Every way of presenting information uses a physical system: spoken words, for example, are conveyed by variations in air pressure. The fact that information does not care how it is conveyed, and can be freely translated from one form to another, makes it an obvious candidate for a fundamental role in physics, alongside quantities such as energy and momentum.

Quantum mechanics underlies various technologies that we take for granted. The transistors in mobile phones, the LEDs in torches, and the MRI machines doctors use to look inside the human body are a few instances. Other applications of quantum technology point toward things that are simply not possible with today's technology. Quantum computing is based on a different method of storing and processing information.

A classical computing bit represents a logical value of 0 or 1. Quantum mechanics provides a much broader way to store a piece of information: a quantum bit, known as a qubit, holds the probabilities that a measurement will yield 0 or 1, with the exact value unknown until it is measured.

Consider flipping a coin. While the coin is in the air, you know only that the probability of heads is 0.5 and the probability of tails is 0.5; once you catch it and look, you know exactly which side came up. One way to describe the state of the spinning coin is that it is both heads and tails at the same time.

Similarly, in the mathematics of quantum mechanics, you do not know the state of a particle such as an electron or proton until you measure the relevant property. And if you know the probabilities that a particle is in each of several states, you can think of it as being in all of those states at the same time.
A qubit, or quantum bit, is the quantum-computing counterpart of the binary digit, or bit, of classical computing.
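
As a minimal sketch (plain Python with NumPy; the equal 50/50 amplitudes are an illustrative choice, not from the original text), the following simulates measuring a single qubit whose state assigns equal probability to 0 and 1, mirroring the spinning-coin analogy:

```python
import numpy as np

# A qubit state |psi> = alpha|0> + beta|1> is a pair of complex amplitudes
# with |alpha|^2 + |beta|^2 = 1. Measuring yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)   # the "spinning coin": 50/50
state = np.array([alpha, beta], dtype=complex)

rng = np.random.default_rng()
samples = rng.choice([0, 1], size=10_000, p=[abs(alpha) ** 2, abs(beta) ** 2])
print("P(0) ~", np.mean(samples == 0))   # ~ 0.5
print("P(1) ~", np.mean(samples == 1))   # ~ 0.5
```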

Extending this idea, N qubits can simultaneously store the probabilities of the system being in any of its 2^N possible states. This is commonly interpreted as saying that a register of N qubits can hold all 2^N possible N-bit values at once.

That is a striking advance over classical bits, where an N-bit register holds exactly one of the 2^N possible values at any moment. There are an estimated 10^78 to 10^82 atoms in the observable universe, so a single register of just 265 qubits has about as many basis states as there are atoms in the universe.
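
The arithmetic behind the 265-qubit claim is easy to check; the following few lines of Python (an illustrative check, not part of the original article) compare 2^265 with the 10^78 to 10^82 estimate:

```python
# 265 qubits give 2**265 basis states; compare with the estimated
# 10**78 to 10**82 atoms in the observable universe.
states = 2 ** 265
print(f"2^265 has {len(str(states))} digits, i.e. about 10^{len(str(states)) - 1}")
print(10 ** 78 <= states <= 10 ** 82)   # True
```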

History of quantum computing

The idea of quantum computing was first proposed by Richard Feynman. In 1981, at MIT, he described the problem that classical computers cannot simulate the evolution of quantum systems in an efficient way, and he proposed an elementary model of a quantum computer that would be capable of such simulations. It took more than ten years for this view of quantum computing to bear fruit, until a special algorithm was created: Shor's algorithm.

Then, in 1994, Peter Shor devised an algorithm that lets a quantum computer factorize large integers exponentially faster than the best known classical algorithms on conventional machines, which would take millions of years to factorize a 300-digit number.
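
As a toy illustration of the reduction Shor's algorithm exploits, the sketch below factors N = 15 by brute-forcing the period r of f(x) = a^x mod N classically; the quantum algorithm's contribution is finding r exponentially faster, which this classical sketch does not attempt (the numbers a = 7 and N = 15 are illustrative choices):

```python
from math import gcd

# Factoring N reduces to finding the period r of f(x) = a^x mod N.
N, a = 15, 7
assert gcd(a, N) == 1

r = 1
while pow(a, r, N) != 1:      # smallest r with a^r = 1 (mod N)
    r += 1

if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    p = gcd(pow(a, r // 2) - 1, N)
    q = gcd(pow(a, r // 2) + 1, N)
    print(p, q)               # 3 5
```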

Since 1945 we have been witnessing rapid growth in the raw performance of computers with respect to their speed and memory size. An important step in this development was the invention of the transistor, which already uses some quantum effects in its operation.[1]

Then, in 1996, Lov Grover discovered a quantum database-search algorithm that provides a quadratic speedup for a wide variety of problems: any problem that would otherwise be solved by random or brute-force search can be completed in roughly the square root of the number of steps a classical search would need.
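
A minimal NumPy simulation of Grover's algorithm (the 3-qubit size and the marked index are illustrative choices, not from the original) shows the quadratic behaviour: about (pi/4) * sqrt(N) iterations concentrate the measurement probability on the marked item:

```python
import numpy as np

n = 3                      # number of qubits
N = 2 ** n                 # size of the search space
marked = 5                 # index of the "database" entry being searched for

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

# Oracle: flip the sign of the amplitude of the marked state.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflect all amplitudes about their mean.
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# Roughly (pi/4) * sqrt(N) iterations are optimal.
iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

probabilities = state ** 2
print(f"P(measure {marked}) = {probabilities[marked]:.3f}")  # ~0.945 for n = 3
```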

In 1998, a working 2-qubit quantum computer was assembled and ran the first quantum algorithms, such as Grover's algorithm. This initiated a new era of computing power, and more and more applications were demonstrated.

Nearly 20 years later, in 2017, IBM unveiled the first commercially usable quantum computer, raising the competition to another level.

Classical vs quantum computing

Computers have been in use since the early 19th century. We are currently in the fourth generation of computers, built on microprocessors after vacuum tubes, transistors and integrated circuits. All of these generations are based on classical computing, which relies on the classical behaviour of electrical circuits being in a single state at a given time: either on or off.

The fifth generation of computers is under development, and quantum computing is its most prominent candidate. Quantum computers work totally differently from classical computers: they are based on the phenomena of quantum mechanics, where a system can be in more than one state at a given time. For certain problems, a quantum computation finds the answer with certainty in exponentially less time than any classical deterministic computation.[4]
 
Classical Computing | Quantum Computing
Based on the classical behaviour of electrical circuits, which are in a single state at a given time: either on or off. | Based on quantum mechanics, where a system can be in more than one state at a time.
Information storage and operations use the "bit", encoded as voltage or charge: low is 0, high is 1. | Information storage and operations use "quantum bits" or "qubits", encoded, for example, in the spin of an electron.
Circuit behaviour is governed by classical physics. | Circuit behaviour is governed by quantum physics (quantum mechanics).
Information is represented in binary code, i.e. 0 or 1. | Information is represented by qubits, i.e. 0, 1 and superposition states of both 0 and 1.
CMOS transistors are the primary building blocks of conventional computers. | SQUIDs or quantum transistors are the primary building blocks of quantum computers.
Data processing is done in a CPU (Central Processing Unit). | Data processing is done in a QPU (Quantum Processing Unit).
Table 1: Comparison between classical and quantum computing

Applications of quantum computing

  1. Error Correction

    Quantum computing uses quantum error correction (QEC) to protect quantum information from errors due to quantum decoherence and other quantum noise. QEC provides a means to detect and undo such departures without upsetting the quantum computation.[2] QEC is essential for achieving fault-tolerant quantum computation, which must cope not only with noise in stored quantum information but also with faulty state preparation and faulty measurements.

    Copying quantum information is impossible because of the no-cloning theorem, which at first sight seems to be an obstacle to formulating a theory of quantum error correction. Peter Shor discovered the first quantum error-correcting code, which stores the information of one qubit in a highly entangled state of nine qubits. A quantum error-correcting code protects quantum information against errors of a limited form.
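
    A minimal sketch of the idea in Python: the 3-bit repetition code, the classical skeleton on which Shor's nine-qubit code builds. In a real quantum device the parity checks are measured via ancilla qubits without reading out the data qubits; the plain-bit version below (all names illustrative) only shows how a syndrome locates and undoes a single flip.

    ```python
    # Encode one logical bit as three physical bits, then detect and
    # correct a single bit flip using two parity checks.
    def encode(bit):
        return [bit, bit, bit]

    def syndrome(codeword):
        # Two parity checks locate (at most) one flipped bit.
        s1 = codeword[0] ^ codeword[1]
        s2 = codeword[1] ^ codeword[2]
        return s1, s2

    def correct(codeword):
        s1, s2 = syndrome(codeword)
        flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get((s1, s2))
        if flip is not None:
            codeword[flip] ^= 1
        return codeword

    word = encode(1)
    word[2] ^= 1                  # noise flips one physical bit
    print(correct(word))          # [1, 1, 1] -- logical bit recovered
    ```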
     
  2. Hacking

    The ability to perform computations on encrypted data is a powerful tool for protecting a client's privacy, especially in today's era of cloud and distributed computing. In terms of privacy, the best solutions that classical techniques can achieve are unfortunately not unconditionally secure: their security depends on a hacker's computational power.[3]

    There is a known quantum algorithm that reduces the effective security of a 3,072-bit RSA key to about 26 bits. With any classical technology expected in the foreseeable future, it is essentially impossible to break a key that provides 128 bits of security, but a key providing only 26 bits of security could be broken with roughly the computing power of a mobile phone.
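
    The gap between 26 and 128 bits of security can be made concrete with a back-of-the-envelope calculation (the assumed rate of 10^9 guesses per second is an illustrative figure, not from the original):

    ```python
    # Rough brute-force cost at an assumed 10**9 guesses per second.
    SECONDS_PER_YEAR = 3.15e7
    for bits in (26, 128):
        seconds = 2 ** bits / 1e9
        print(f"{bits}-bit security: {seconds / SECONDS_PER_YEAR:.3g} years")
    # 26 bits falls in a fraction of a second; 128 bits takes ~10**22 years.
    ```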

    If engineers discover how to build a large-scale quantum computer, the security provided by the RSA algorithm will essentially disappear, just like the security provided by the other widely used public-key encryption algorithms.

    In other words, the security of more or less all the public-key encryption algorithms in broad use today will drop to effectively zero if an attacker gains access to a large quantum computer.

    However, several known public-key encryption algorithms are believed to be secure against attacks by quantum computers, and some have been examined and endorsed by reputable standards bodies: IEEE Std 1363.1 and OASIS KMIP already identify quantum-safe algorithms. So, if progress in quantum computing threatens to make current public-key algorithms breakable, it will be straightforward to move to quantum-safe algorithms.

    Against symmetric ciphers such as AES, the known quantum attacks merely halve the number of bits of security a key provides: a 256-bit AES key, for example, provides 128 bits of security against a quantum attacker. So a system already using AES-256 is already using an encryption algorithm that will provide sufficient security against quantum computers.
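
    In Python, the halving rule looks like this (a sketch of the arithmetic only; the rule reflects Grover's quadratic speedup applied to brute-force key search):

    ```python
    # Effective security of symmetric keys under Grover's attack,
    # which quadratically speeds up key search: n-bit key -> n/2 bits.
    for key_bits in (128, 192, 256):
        print(f"AES-{key_bits}: ~{key_bits // 2} bits of quantum security")
    ```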

    In short, it will still be possible to communicate securely in an environment where attackers have big quantum computers.
     
  3. Quantum Parallelism

    Parallel computing is a type of computing architecture in which many processors execute an application or computation at the same time. It makes huge computations tractable by dividing the work among multiple processors that operate simultaneously, and it is also known as parallel processing.

    The most interesting new feature of quantum computing is quantum parallelism. A quantum state, in general, consists of a superposition of many classical or classical-like states. This superposition is not just a figure of speech covering up our ignorance of which classical state the system is "really" in: if it were, you could drop all but one of the classical-like states and still get the time evolution right.

    In fact, you need the complete superposition to get the time evolution right; the system is, in some sense, in all of the classical-like states at once. If the superposition can be protected from unwanted entanglement with its environment, known as decoherence, a quantum computer can deliver results that depend on all of its classical-like states.

    This is known as quantum parallelism: parallelism on a serial machine. And if that is not enough, machines that already qualify as parallel in architectural terms can benefit from quantum parallelism too.
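
    A minimal state-vector sketch (plain NumPy; the function x -> (x + 1) mod 8 is an illustrative choice) shows the idea: Hadamard gates place a 3-qubit register in a superposition of all eight basis states, and a single application of a reversible circuit for f then acts on every one of them.

    ```python
    import numpy as np

    # A 3-qubit register simulated as a vector of 8 amplitudes.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    H3 = np.kron(np.kron(H, H), H)       # Hadamard on each of the 3 qubits

    state = np.zeros(8)
    state[0] = 1.0                       # start in |000>
    state = H3 @ state                   # equal superposition of |0>..|7>
    print(np.round(state, 3))            # all 8 amplitudes = 1/sqrt(8)

    # A reversible f, here x -> (x + 1) mod 8, is a permutation of basis states.
    U_f = np.roll(np.eye(8), 1, axis=0)
    state = U_f @ state                  # f applied to all 8 inputs in one step
    ```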


Interplay between quantum computing and Artificial Intelligence

Quantum computing has the power to improve artificial-intelligence systems in the coming years. For example, a quantum computer could help develop AI-based digital assistants with real contextual awareness and the ability to understand their interactions with people.

The hope is that quantum computing's high computational power will someday tame the exponential blow-ups encountered in AI. AI systems thrive when machine-learning algorithms are used to train them on huge amounts of data to store, identify and analyse; the more precisely that data can be organised and classified by specific features, the better the AI will perform. Quantum computing is expected to play an important role in machine learning, including the important aspect of accessing more computationally complex feature spaces.

Researchers are trying to figure out how to speed up these processes by applying quantum algorithms to AI techniques, giving rise to a new discipline that has been dubbed quantum machine learning (QML).

Voice assistants, for example, could benefit from this work, because quantum hardware could dramatically increase their accuracy, boosting both their processing power and the amount of data they can handle. Quantum computing increases the number of calculation variables machines can juggle and therefore allows them to provide faster answers, much as a person would.

Conclusion
Quantum computing promises the ability to solve problems that are, for all practical purposes, unsolvable by classical computers. However, quantum computing still has a long way to go before it gains practical traction. Some of the very properties of quantum mechanics that give quantum computers their superior performance also make designing quantum algorithms and building functional hardware extremely difficult. We need solutions that improve the quality of qubit technology by extending the coherence time of qubits and increasing the speed of quantum operations. We also need to perfect the preparation of qubit states for quantum error correction.

References:
  1. Gruska, J. (1999). Quantum Computing. London: McGraw-Hill.
  2. QEC provides a means to detect and undo such departures without upsetting the quantum computation.
  3. Marshall, K., Jacobsen, C. S., Schäfermeier, C., Gehring, T., Weedbrook, C., & Andersen, U. L. (2016). Continuous-variable quantum computing on encrypted data. Nature communications, 7(1), 1-7.
  4. Deutsch, D., & Jozsa, R. (1992). Rapid solution of problems by quantum computation. Proceedings of the Royal Society of London. Series A: Mathematical and Physical Sciences, 439(1907), 553-558.
  5. Ying, M. (2010). Quantum computation, quantum theory and AI. Artificial Intelligence, 174(2), 162-176.
Written By: Himanshi Bhatia. The author completed her B.C.A. from Guru Gobind Singh Indraprastha University, New Delhi, with distinction in 2020, and is now pursuing her MBA from NMIMS. The author can be reached at [email protected]
