What is quantum?
Quantum is a word most of us know, but relatively few understand properly. That is partly because it has multiple meanings in different contexts, but mostly because the word is so often misappropriated. Quantum comes from the Latin for "how great", meaning an amount, and in modern usage it means the smallest possible discrete unit of any physical property, such as energy or matter. In physics, a quantum (plural: quanta) is a discrete quantity of energy proportional in magnitude to the frequency of the radiation it represents.
What is quantum computing?
Quantum computing, one of the "jazziest and most mysterious concepts" in science, has struggled to come of age. Traditional models of computation, such as the Turing machine or the lambda calculus, rely on classical physics. Quantum computing is the study of a non-classical model of computation.

Most traditional models represent computational memory as a system that can be found in one of a finite set of possible states, each of which is physically distinct. It is frequently convenient to represent the state of this memory as a string of symbols; most simply, as a string of 0s and 1s. In these terms, the fundamental unit of memory is called a bit. A quantum computation, by contrast, can transform the memory into a quantum superposition of possible classical states. In the quantum scenario, the fundamental unit of memory is called a qubit.

In quantum computing, quantum supremacy is the goal of demonstrating that a programmable quantum device can solve a problem that classical computers practically cannot: the quantum device solves the problem correctly far faster than any classical computer could. The field of quantum computing is closely related to quantum information science, which includes quantum cryptography and quantum communication. In one claimed demonstration of quantum supremacy, a quantum device performed in 200 seconds a series of operations estimated to take a supercomputer 10,000 years to complete. A quantum computer is a device that can perform quantum computation.
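To make the bit/qubit contrast above concrete, here is a minimal sketch in plain Python (no quantum hardware or quantum libraries assumed; the function names are illustrative, not from any real quantum SDK). It models a single qubit as a pair of amplitudes, puts it into an equal superposition with a Hadamard gate, and samples measurements.

```python
import random

# A qubit's state is a pair of amplitudes (a, b) for the basis states
# |0> and |1>, with |a|^2 + |b|^2 = 1. A classical bit is the special
# case where one amplitude is 1 and the other is 0.

def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 2 ** -0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse the qubit: returns 0 with probability |a|^2, else 1."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = (1.0, 0.0)        # start in the classical state |0>
qubit = hadamard(qubit)   # now an equal superposition of |0> and |1>

samples = [measure(qubit) for _ in range(10_000)]
print(samples.count(0), samples.count(1))  # roughly even split
```

Measuring the superposed qubit yields 0 or 1 with equal probability; the superposition itself can never be read out directly, which is part of what makes designing quantum algorithms subtle.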
From quantum to quantum computing:
The revolutionary idea of the quantum began with Max Planck. In 1900, the German theoretical physicist introduced the energy quantum while solving the black-body radiation problem, and he won the Nobel Prize in 1918 for quantum theory. Niels Bohr was a Danish physicist who made foundational contributions to understanding atomic structure and quantum theory, for which he received the Nobel Prize in Physics in 1922. Planck and Bohr are both considered fathers of quantum theory, and each received a Nobel Prize in Physics for his work on quanta. Einstein is considered the third founder of quantum theory because he described light as quanta in his explanation of the photoelectric effect, for which he won the 1921 Nobel Prize.
Around 1970, Stephen Wiesner invented conjugate coding. He is a research physicist who discovered several of the most important ideas in quantum information theory during the 1960s and 1970s.
David Finkelstein was a professor of physics at the Georgia Institute of Technology. Most of his work was directed toward a quantum theory of space-time structure; in 1968 he formulated a quantum computer with spins as quantum bits for use as a quantum spacetime. Alexander Holevo is a Soviet and Russian mathematician and one of the pioneers of quantum information science. In 1973 he published a paper showing that n qubits cannot carry more than n classical bits of information. Also in 1973, Charles H. Bennett showed that computation can be done reversibly. Bennett is a physicist, information theorist, and IBM Fellow at IBM Research; with Gilles Brassard, he discovered the concept of quantum cryptography, and he is one of the founding fathers of modern quantum information theory. In 1976, the Polish mathematical physicist Roman Stanisław Ingarden published a seminal paper entitled "Quantum Information Theory" in Reports on Mathematical Physics, vol. 10, 43–72.
In 1980, Paul Benioff described quantum mechanical Hamiltonian models of computers, and Yuri Manin proposed the idea of quantum computing. In 1981, at the First Conference on the Physics of Computation, held at MIT in May, Paul Benioff and Richard Feynman gave talks on quantum computing. In his talk, Feynman observed that it appeared to be impossible to efficiently simulate the evolution of a quantum system on a classical computer, and he proposed a basic model for a quantum computer. In 1982, Paul Benioff proposed the first recognizable theoretical framework for a quantum computer. In 1985, David Deutsch, at the University of Oxford, described the first universal quantum computer. Just as a universal Turing machine can simulate any other Turing machine efficiently (the Church–Turing thesis), the universal quantum computer can simulate any other quantum computer with at most a polynomial slowdown.
So, in general, the field of quantum computing was initiated by the work of Paul Benioff and Yuri Manin in 1980, Richard Feynman in 1982, and David Deutsch in 1985. The underlying ideas began to form around 1960–1973, but the field really started spreading in the 1980s. It changed and evolved rapidly from 1980 to 2000 and keeps evolving today.
A quantum machine is the result of more than a century's worth of research involving thousands of researchers. Quantum theory has been a topic of science since the last century. There is an increasing amount of investment in quantum computing by governments, established companies, and start-ups. All the big names are involved: IBM, Google, Alibaba, Hewlett Packard, Tencent, Baidu, and Huawei are all doing their own research.