(NaturalNews) Two parties with very different motives - one a company looking to advance its business model, the other a government agency seeking to enhance its code-breaking and encryption capabilities - are each investing in a new computing model that, while still years away, promises to significantly boost analytical capacity.
Despite the incredible computing power already available in the world, some visionaries - including D-Wave Systems, Inc., the world's first commercial quantum computing company, in which Amazon.com founder Jeff Bezos and In-Q-Tel, the CIA's venture capital branch, have invested - are attempting to take the technology to the next level.

Chips can only get so small
Though such computing power does not yet exist, these visionaries are betting it will, especially given Moore's Law - a principle named after Intel co-founder Gordon E. Moore, who observed in a 1965 paper that over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years (a period since revised down to 18 months by Intel executive David House). As transistor counts rise, the transistors themselves also shrink.
If, according to this principle, "the number of transistors on a microprocessor continues to double every 18 months, the year 2020 or 2030 will find the circuits on a microprocessor measured on an atomic scale," write Kevin Bonsor and Jonathan Strickland at HowStuffWorks.com.
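As a back-of-the-envelope illustration, the 18-month doubling period can be projected forward in a few lines of Python. The starting year and transistor count below are illustrative assumptions (roughly a circa-2000 desktop processor), not figures from the article:

```python
# Rough Moore's Law projection: transistor count doubles every 18 months.
# The starting point (year 2000, ~42 million transistors) is an
# illustrative assumption, not a figure from the article.

def projected_transistors(start_count, start_year, target_year, doubling_months=18):
    """Project a transistor count forward assuming a fixed doubling period."""
    months_elapsed = (target_year - start_year) * 12
    doublings = months_elapsed / doubling_months
    return start_count * 2 ** doublings

count_2020 = projected_transistors(42_000_000, 2000, 2020)
print(f"Projected transistor count by 2020: {count_2020:,.0f}")
```

Thirteen-plus doublings over twenty years lands in the trillions, which is why the authors expect circuits to hit atomic scales somewhere in the 2020-2030 window.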
The next logical step, they argue, is quantum computing, because conventional chips are approaching a physical limit: they can only get so small.

Let's talk qubits
Here is one description of how a quantum computer would operate differently - or, maybe more accurately, on a different plane - than a traditional computer:
A classic computer reads binary code -- everything is broken down into a language of ones and zeros. The computer reads these ones and zeros as data and instructions. A quantum computer, however, computes on atoms rather than silicon -- in theory... The idea is that a quantum computer creates a quantum state in which quantum bits, or qubits, can exist. A qubit is an odd concept -- unlike binary bits, qubits can be a one, a zero, or both a one and a zero at the same time (the latter is a state called "superposition" in quantum theory).
Researchers go on to theorize that with enough qubits, you would be able to process highly complex problems in a fraction of the time current computers require. In fact, it is believed a quantum computer would be able to evaluate all possible outcomes simultaneously, ranking results by probability. One result might have a 95 percent chance of being correct, while all the others together would account for the remaining five percent.
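The probability-ranking idea can be sketched numerically. In quantum mechanics, a qubit's state is described by two amplitudes, and the chance of measuring a 0 or a 1 is the squared magnitude of each amplitude. Here is a minimal Python sketch; the specific amplitude values are illustrative assumptions, not anything from the article:

```python
import math

# A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measuring the qubit yields 0 with probability |a|^2 and 1 with |b|^2.

def measurement_probabilities(a, b):
    """Return (P(0), P(1)) for a qubit with amplitudes a and b."""
    p0 = abs(a) ** 2
    p1 = abs(b) ** 2
    total = p0 + p1          # normalize in case the state isn't unit length
    return p0 / total, p1 / total

# Equal superposition -- the "both one and zero" state described above:
print(measurement_probabilities(1 / math.sqrt(2), 1 / math.sqrt(2)))

# A lopsided state: ~95% chance of one outcome, ~5% for the other,
# mirroring the 95-percent example in the text.
print(measurement_probabilities(math.sqrt(0.95), math.sqrt(0.05)))
```

This is only the single-qubit picture; the claimed speedups come from many qubits whose joint state spans exponentially many such outcomes at once.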
"So far, quantum computers have proven difficult to maintain in any practical sense. One problem stems from interference -- basically any vibrations that are strong enough to harm the spin of an electron being used in quantum computing," says HowStuffWorks.com
.The future looks bright
Vern Brownell, CEO of D-Wave, a British Columbia-based firm, expressed confidence the quantum computing (QC) mystery could be solved.
"Jeff Bezos and In-Q-Tel are well-known visionaries. Both understand the implications of quantum computing as a world
changing force, and these investments affirm their belief in D-Wave's unique approach to quantum computing," he said in a news release.
Much of the interest surrounding development of the technology is intelligence-based.
"Our intelligence community customers have many complex problems that tax classical computing architecture," said Robert Ames, Vice President in charge of Information and Communication Technologies at IQT. "We believe our customers can benefit from the promise of quantum computing, and this investment in D-Wave is a first step in that direction."
Earlier this summer, IBM announced it had developed the world's fastest supercomputer, Sequoia, which outperformed the previous leader, Fujitsu's K Computer - a machine that had been benchmarked at 8.162 petaflop/s, or 8.162 quadrillion floating-point operations per second.
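For scale, the petaflop figure is plain unit arithmetic - a sketch, using the 8.162 petaflop/s rate quoted above:

```python
# A petaflop/s is 10**15 floating-point operations per second,
# i.e. one quadrillion operations each second.
PETA = 10 ** 15

rate_pflops = 8.162              # the benchmarked rate quoted in the article
rate_flops = rate_pflops * PETA  # same rate in raw operations per second

print(f"{rate_pflops} petaflop/s = {rate_flops:,.0f} operations per second")
```

Even at that rate, the problems the intelligence community describes are hard because their cost grows exponentially with input size, which is exactly the regime quantum computing is hoped to address.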