21 Nov 2016
Quantum computing is heralded as the next revolution in global computing. Google, Intel and IBM are among the big names currently investing millions in the field, which promises the faster, more efficient processing needed to meet our future computing demands.
Now a researcher and his team at Tyndall National Institute in Cork have made a ‘quantum leap’ by developing a technical step that could enable the use of quantum computers sooner than expected.
Conventional digital computing uses ‘on-off’ switches, but quantum computing looks to harness quantum states of matter – such as entangled photons of light or multiple states of atoms – to encode information. In theory, this can lead to much faster and more powerful computer processing, but the technology to underpin quantum computing is currently difficult to develop at scale.
Researchers at Tyndall have taken a step forward by making quantum dot light-emitting diodes (LEDs) that can produce entangled photons (photons whose quantum states are linked), theoretically enabling their use to encode information in quantum computing.
This is not the first time that LEDs have been made that can produce entangled photons, but the methods and materials described in the new paper (Nature Photonics, “Selective carrier injection into patterned arrays of pyramidal quantum dots for entangled photon light-emitting diodes”) have important implications for the future of quantum technologies, explains researcher Dr Emanuele Pelucchi, Head of Epitaxy and Physics of Nanostructures and a member of the Science Foundation Ireland-funded Irish Photonic Integration Centre (IPIC) at Tyndall National Institute in Cork.
“The new development here is that we have engineered a scalable array of electrically driven quantum dots using easily-sourced materials and conventional semiconductor fabrication technologies, and our method allows you to direct the position of these sources of entangled photons,” he says.
“Being able to control the positions of the quantum dots and to build them at scale are key factors to underpin more widespread use of quantum computing technologies as they develop.”
The Tyndall technology uses nanotechnology to electrify arrays of the pyramid-shaped quantum dots so they produce entangled photons. “We exploit intrinsic nanoscale properties of the whole ‘pyramidal’ structure, in particular, an engineered self-assembled vertical quantum wire, which selectively injects current into the vicinity of a quantum dot,” explains Dr Pelucchi.
“The reported results are an important step towards the realization of integrated quantum photonic circuits designed for quantum information processing tasks, where thousands or more sources would function in unison.”
“It is exciting to see how research at Tyndall continues to break new ground, particularly in relation to this development in quantum computing. The significant breakthrough by Dr Pelucchi advances our understanding of how to harness the opportunity and power of quantum computing and undoubtedly accelerates progress in this field internationally. Photonics innovations by the IPIC team at Tyndall are being commercialized across a number of sectors and as a result, we are directly driving global innovation through our investment, talent and research in this area,” said Dr Kieran Drain, CEO at Tyndall National Institute.
Source: Tyndall National Institute
28 Jul 2015
MIT is a key player in a new $600 million public-private partnership announced today by the Obama administration to help strengthen high-tech U.S.-based manufacturing.
Physically headquartered in New York state and led by the State University of New York Polytechnic Institute (SUNY Poly), the American Institute for Manufacturing Integrated Photonics (AIM Photonics) will bring government, industry, and academia together to advance domestic capabilities in integrated photonic technology and better position the U.S. relative to global competition.
Federal funding of $110 million will be combined with some $500 million from AIM Photonics’ consortium of state and local governments, manufacturing firms, universities, community colleges, and nonprofit organizations across the country.
Technologies that can help to integrate photonics, or light-based communications and computation, with existing electronic systems are seen as a crucial growth area as the world moves toward ever-greater reliance on more powerful high-tech systems. What’s more, many analysts say this is an area that could help breathe new life into a U.S. manufacturing base that has been in decline in recent years.
The public-private partnership announced today aims to spur these twin goals, improving integration of photonic systems while revitalizing U.S. manufacturing. The consortium includes universities, community colleges, and businesses in 20 states. Six state governments, including that of Massachusetts, are also supporting the project.
MIT faculty will manage important parts of the program: Michael Watts, an associate professor of electrical engineering and computer science, will lead the technological innovation in silicon photonics. Lionel Kimerling, the Thomas Lord Professor in Materials Science and Engineering, will lead a program in education and workforce development.
“This is great news on a number of fronts,” MIT Provost Martin Schmidt says. “Photonics holds the key to advances in computing, and its pursuit will engage and energize research and economic activity from Rochester, New York, to Cambridge, Massachusetts, and beyond. MIT faculty are excited to contribute to this effort.”
An ongoing partnership
MIT’s existing collaboration with SUNY Poly led to the first complete 300-millimeter silicon photonics platform, Watts says. That effort has led to numerous subsequent advances in silicon photonics technology, with MIT developing photonic designs that SUNY Poly has then built in its state-of-the-art fabrication facility.
Photonic devices are seen as key to continuing the advances in computing speed and efficiency described by Moore’s Law — which may have reached their theoretical limits in existing silicon-based electronics, Kimerling says. The integration of photonics with electronics promises not only to boost the performance of systems in data centers and high-performance computing, but also to reduce their energy consumption — which already accounts for more than 2 percent of all electricity use in the U.S.
Kimerling points out that a single new high-performance computer installation can contain more than 1 million photonic connections between hundreds of thousands of central processing units (CPUs). “That’s more than the entire telecommunications industry,” he says — so creating new, inexpensive, and energy-efficient connection systems at scale is a major need.
The integration of such systems has been progressing in stages, Kimerling says. Initially, the conversion from optical to electronic signals became pervasive at the network level to support long-distance telecommunication, but it is now moving to circuit boards, and will ultimately go to the level of individual integrated-circuit chips.
“Europe is ahead in industry coordination right now,” following a decade of government investment, Kimerling says. This new U.S. initiative, he says, is “one of the first of this kind in the U.S., and the bet is that the innovation and research here, combined with the manufacturing capability, will allow our companies to really take off.”
Leadership in technological innovation
Within the new alliance, MIT will lead technological innovation in silicon photonics. That task will be managed by Watts.
The evolving integration of photonics and electronics will have a great impact on many different technologies, Watts says. For example, LIDAR systems — similar to radar, but using light beams instead of radio waves — have great potential for collision-avoidance systems in cars, since they can provide greater detail than radar or sonar. Watts has worked to develop single-chip LIDAR devices, which could eliminate the moving parts in existing devices — such as tiny gimbaled mirrors used to direct the light beams in a scanning pattern — replacing them with fixed, electrically steerable phased-array systems, like those now used for cellphone tower antennas.
“LIDAR systems that exist today are both bulky and expensive, because they use mechanically scanned lasers,” Watts says. But doing the same thing at the nanoscale, using phased-array systems on a chip, could drastically reduce size and cost, providing high-resolution, chip-scale, 3-D imaging capabilities that do not exist today, he says.
There are many other areas where integration of photonics and electronics could lead to big advances, including in biological and chemical sensors that could have greater sensitivity than existing electronic versions, and in new kinds of medical imaging systems, such as optical coherent tomography.
“The goal of this initiative is to lower the barriers to entry in this field for U.S. companies,” Watts says. It is intended to function much like a major public-private initiative that helped pave the way, decades ago, for the development of electronic chip manufacturing in the U.S.
Significant photonic chip manufacturing capabilities have been developed at SUNY Poly, in Albany, New York. That facility has already made the world’s largest silicon-based photonic circuit, a chip designed at MIT, and built using industry-standard 300-millimeter-wide silicon wafers, Watts says.
Contributions in education and training
MIT will also host AIM Photonics’ program in education and workforce development, which Kimerling will direct. This will include developing educational materials — ranging from K-12 through continuing education — to prepare future employees for this emerging industry, including teaching on the design of integrated photonic devices. MIT will also lead workforce development, with an emphasis on including veterans, underrepresented minorities, and other students, by developing a variety of materials to teach about the new technologies.
MIT will work to support internships, apprenticeships, and other forms of hands-on training in a national network of industry and university partners. The effort will also support an industry-wide roadmap to help align the technology supply chain with new manufacturing platforms.
Kimerling says that a significant issue in developing a robust photonics industry is the need to develop a trained workforce of people who are familiar with both electronics and optical technology — two very different fields. Educational programs that encompass these disparate fields “are important, and don’t exist today in one organization,” he says.
One expected impact of the new initiative is the development of a corridor along Interstate 90, from Boston to Rochester, New York, of industrial firms building on the base of new technology to develop related products and services, much as Silicon Valley emerged in California around companies such as Intel and their chip-making technology.
Other major members of AIM Photonics include the University of Arizona, the University of Rochester, and the University of California at Santa Barbara. In addition to the Department of Defense, federal funding for the project will come from the National Science Foundation, the Department of Energy, the National Institute for Standards and Technology, and NASA.
Roots in the Advanced Manufacturing Partnership
Today’s news flows from the work of the Advanced Manufacturing Partnership (AMP), a White House-led effort begun in 2011 with the aim of bringing together industry, universities, and the federal government to identify and invest in key emerging technologies, with the idea of stoking a “renaissance in American manufacturing.”
AMP was inaugurated with former MIT President Susan Hockfield as co-chair; MIT President L. Rafael Reif subsequently served in that same capacity as part of “AMP 2.0.” Those groups’ work led to President Barack Obama’s commitment to establish a National Network of Manufacturing Innovation, to consist of linked institutes such as the one announced today.
“Massachusetts’ strong role in the AIM Photonics team stems from a collaboration involving MIT and many other partner organizations across the Commonwealth: universities, community colleges, and large and small manufacturers throughout the integrated photonics supply chain,” says Krystyn Van Vliet, a professor of materials science and engineering and biological engineering, and MIT faculty lead for AMP 2.0. “The support of Gov. Charlie Baker and Secretary of Housing and Economic Development Jay Ash was key to the success of the AIM Photonics team, and we appreciate their efforts. This manufacturing institute will help Massachusetts inspire and prepare the next generation of integrated photonics manufacturing careers, businesses, and leaders.”
“Today’s announcement is a testament to the outstanding team of industrial and academic leaders assembled by AIM Photonics and its plan to establish the U.S. as a global leader in this emerging technology,” says Michael Liehr, AIM CEO and SUNY Poly executive vice president of innovation and technology and vice president of research. “This would not have been possible without the critical support of Gov. Andrew Cuomo, whose pioneering leadership in establishing New York state’s globally recognized, high-tech R&D ecosystem has enabled historic economic growth and innovation and secured our partnership with the state of Massachusetts. SUNY Poly is excited to be working with partners such as MIT on this initiative, which will be truly transformational for both the industry and the nation.”
Georgia Tech Research Institute (GTRI) scientists work in an optical lab developing improved ion traps that could be used in quantum computing. Shown are (l-r) research scientists Jason Amini and Nicholas Guise. (Credit: Rob Felt)
“To write down the quantum state of a system of just 300 qubits, you would need 2^300 numbers, roughly the number of protons in the known universe, so no amount of Moore’s Law scaling will ever make it possible for a classical computer to process that many numbers,” said Nicholas Guise, a GTRI research scientist who led the research. “This is why it’s impossible to fully simulate even a modest sized quantum system, let alone something like chemistry of complex molecules, unless we can build a quantum computer to do it.”
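The arithmetic behind Guise's point can be checked directly. Here is a short sketch in plain Python (illustrative only, not from the article) of how the storage needed for an n-qubit state vector grows:

```python
# An n-qubit quantum state is described by 2**n complex amplitudes.
# A quick check of the exponential growth Guise describes.

def amplitudes_needed(n_qubits: int) -> int:
    """Number of complex amplitudes in a full n-qubit state vector."""
    return 2 ** n_qubits

for n in (10, 50, 300):
    print(f"{n} qubits -> {amplitudes_needed(n)} amplitudes")

# 2**300 is about 2 x 10**90, comfortably beyond the ~10**80 protons
# commonly estimated for the observable universe.
assert amplitudes_needed(300) > 10 ** 80
```

At 10 qubits the list fits in memory; by 50 it strains a supercomputer; by 300 no classical machine can even write it down.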
While existing computers use classical bits of information, quantum computers use “quantum bits” or qubits to store information. Classical bits use either a 0 or 1, but a qubit, exploiting a weird quantum property called superposition, can actually be in both 0 and 1 simultaneously, allowing much more information to be encoded. Since qubits can be correlated with each other in a way that classical bits cannot, they allow a new sort of massively parallel computation, but only if many qubits at a time can be produced and controlled. The challenge that the field has faced is scaling this technology up, much like moving from the first transistors to the first computers.
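The superposition and correlation described above can be made concrete with a toy state vector, sketched here in plain Python (an illustration, not any lab's actual software):

```python
import math

# A qubit is a pair of complex amplitudes; |amplitude|**2 gives the
# probability of measuring 0 or 1.

# |+> = (|0> + |1>)/sqrt(2): an equal superposition of 0 and 1.
plus = [1 / math.sqrt(2), 1 / math.sqrt(2)]
probs = [abs(a) ** 2 for a in plus]
print(probs)  # each outcome has probability ~0.5

# A two-qubit Bell state (|00> + |11>)/sqrt(2): the kind of correlation
# between qubits that classical bits cannot have.
# Amplitude order: 00, 01, 10, 11.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
# Outcomes 01 and 10 never occur: measuring one qubit fixes the other.
assert abs(bell[1]) ** 2 == 0 and abs(bell[2]) ** 2 == 0
```

Note that describing just two qubits already takes four amplitudes; each added qubit doubles the list, which is the scaling challenge the article describes.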
One leading qubit candidate is individual ions trapped inside a vacuum chamber and manipulated with lasers. The scalability of current trap architectures is limited because the electrical connections for the electrodes that generate the trapping fields are made at the edge of the chip, so their number is limited by the chip perimeter.
The GTRI/Honeywell approach uses new microfabrication techniques that allow more electrodes to fit onto the chip while preserving the laser access needed.
The team’s design borrows ideas from a type of packaging called a ball grid array (BGA) that is used to mount integrated circuits. The ball grid array’s key feature is that it can bring electrical signals directly from the backside of the mount to the surface, thus increasing the potential density of electrical connections.
The researchers also freed up more chip space by replacing area-intensive surface or edge capacitors with trench capacitors and strategically moving wire connections.
The space-saving moves allowed tight focusing of an addressing laser beam for fast operations on single qubits. The team ran into early difficulties bonding the chips, but a solution developed in collaboration with Honeywell worked well: the device was trapping ions from the very first day.
The team was excited by the results. “Ions are very sensitive to stray electric fields and other noise sources, and a few microns of the wrong material in the wrong place can ruin a trap. But when we ran the BGA trap through a series of benchmarking tests we were pleasantly surprised that it performed at least as well as all our previous traps,” Guise said.
Working with trapped ion qubits currently requires a room full of bulky equipment and several graduate students to make it all run properly, so the researchers say much work remains to be done to shrink the technology. The BGA project demonstrated that it’s possible to fit more and more electrodes on a surface trap chip while wiring them from the back of the chip in a compact and extensible way. However, there are a host of engineering challenges that still need to be addressed to turn this into a miniaturized, robust and nicely packaged system that would enable quantum computing, the researchers say.
In the meantime, these advances have applications beyond quantum computing. “We all hope that someday quantum computers will fulfill their vast promise, and this research gets us one step closer to that,” Guise said. “But another reason that we work on such difficult problems is that it forces us to come up with solutions that may be useful elsewhere. For example, microfabrication techniques like those demonstrated here for ion traps are also very relevant for making miniature atomic devices like sensors, magnetometers and chip-scale atomic clocks.”
The race to make the first quantum computer is becoming as important as the race 75 years ago to get the first nuke. It could change the balance of power in politics and business.
Quantum computers have long been theoretically possible but a kind of futuristic fantasy, like Interstellar-style wormhole travel, or zero-calorie Hershey Bars. I first wrote in the 1990s about the quest for one.
Now breakthroughs are coming faster, and scientists say we’re 15 to 20 years away from fully functional, programmable quantum computers.
This technology will make the microprocessor in your laptop seem as sophisticated as a booger. The silicon-based technology inside today’s computers, which engineers have constantly made faster and cheaper for five decades, is running out of ways to get better. Quantum computers will herald, well, a quantum leap—like riding a horse one day and getting into a fighter jet the next. For certain problems, these machines will be millions of times more powerful than today’s fastest supercomputers, cracking puzzles that defy today’s machines, like dead-on accurate weather prediction or modeling protein molecules for medical research.
A quantum computer could also create virtually unbreakable encryption, and crack existing computer security as easily as you unzip your fly. We’re entering an era of cyberwar, so imagine how power might shift if one country gets the ability to invade any other country’s computer systems while putting up the ultimate computer defenses. That’s a major reason nations are pouring money into this research. The U.K., China, Russia, Australia, the Netherlands and other countries are in the game. In the U.S., the CIA, National Security Agency and Pentagon are all funding research, while Los Alamos National Laboratory operates one of the most significant quantum computer labs.
Negotiations to keep nuclear weapons from Iran are certainly critical, but if you play out the promise of quantum computing, an American machine could bust into Iranian systems and shut down all that country’s nuclear activity in an instant. It’s like a game of rock-paper-scissors: Nukes might be the world’s version of a rock, but quantum computers would be paper, winning every time.
And yet, quantum computing research isn’t self-contained and secretive in the manner of the Los Alamos atomic bomb work during World War II. Some of it is academic work at universities such as the Massachusetts Institute of Technology, with findings shared in scientific papers. Technology companies are working on this, too, since these things have the potential to be business nukes. IBM, Google and Microsoft all fund research. Imagine if Google gets one before Microsoft. That pesky Bing could wind up vaporized. Google has a Quantum Artificial Intelligence unit working with the University of California, Santa Barbara, with a goal of developing a quantum machine that can learn.
Meanwhile, a Canadian startup, D-Wave Systems, is partially funded by Amazon CEO Jeff Bezos—and the CIA. The very secretive and often controversial company is already marketing a hybrid machine that seems to be a traditional silicon computer with some sort of quantum turbo thruster.
There’s no telling, yet, how this technology will emerge, or whether it will be hoarded by a few nations (like nukes) or spread around the globe (like computers). “There’s a healthy mix of cooperation and competition,” IBM quantum computing scientist Jerry Chow tells me. “But there’s starting to be more competition.”
Keep in mind that this technology is really hard, and really, really weird. A quantum computer makes calculations using quantum bits, or qubits (encoded, for example, in the spin states of individual atoms), and it relies on bizarre quantum properties like superposition, which some physicists describe in terms of multiple parallel universes. Loosely speaking, a quantum computer is fast because it can explore many possible answers at once. To borrow a metaphor from D-Wave CEO Vern Brownell, let’s say you had to find an X written on one page among the 37 million books in the Library of Congress. A typical computer today would look at every page, one at a time—very quickly, but still in a serial process. A quantum computer could look at every page at the same time—as if it were splitting the task into a billion parallel universes, finding the answer, then coming back to ours to show us where that X is.
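The library metaphor overstates things a little: the best known quantum search method, Grover's algorithm, finds the marked page in roughly the square root of N steps rather than one. A minimal pure-Python statevector sketch (illustrative only) shows the amplification at work on 8 "pages":

```python
import math

# Grover's search over N = 8 items (3 qubits), simulated with a plain
# list of amplitudes. The marked item's probability rises from 1/8 to
# ~0.95 in about sqrt(N) iterations, versus ~N/2 classical looks.

N = 8
marked = 5  # the "page" holding the X (arbitrary choice for this demo)

# Start in a uniform superposition: every item equally likely.
amps = [1 / math.sqrt(N)] * N

for _ in range(round(math.pi / 4 * math.sqrt(N))):  # 2 iterations here
    amps[marked] = -amps[marked]           # oracle: flip the marked sign
    mean = sum(amps) / N                   # diffusion: reflect about mean
    amps = [2 * mean - a for a in amps]

probability = amps[marked] ** 2
print(f"P(marked) after Grover: {probability:.3f}")
assert probability > 0.9
```

The speedup is quadratic, not infinite, but for huge search spaces that difference is still enormous.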
Scientists have managed to get a few qubits to do calculations in labs, but we are far from getting a stable, programmable, fully quantum machine. On April 29, IBM announced what it says is a significant advance—a way to detect and measure the two types of quantum errors, called bit-flip and phase-flip, at the same time. That sounds esoteric to most of us, but it will help with a peculiar problem that vexes researchers: The very act of looking at a qubit to get its answer can make the qubit change its answer. So some mechanism needs to figure out whether we’re seeing the right answer. Again—this stuff is really weird.
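For intuition about bit-flip errors, the classical ancestor of such codes is the 3-bit repetition code, sketched below in plain Python (an illustration of the majority-vote idea, not IBM's method). Phase-flip errors have no classical counterpart, which is part of why detecting both kinds at once is hard:

```python
import random

# Classical repetition coding: store one logical bit as three physical
# bits, then use a majority vote to detect and undo any single flip.
# Quantum codes extend this idea with syndrome measurements that reveal
# the error without directly reading (and disturbing) the data qubits.

def encode(bit: int) -> list:
    """Encode one logical bit as three copies."""
    return [bit, bit, bit]

def apply_bit_flip(codeword: list, position: int) -> list:
    """Simulate noise: flip one bit of the codeword."""
    flipped = list(codeword)
    flipped[position] ^= 1
    return flipped

def decode(codeword: list) -> int:
    """Majority vote recovers the logical bit despite one flip."""
    return int(sum(codeword) >= 2)

word = encode(1)
noisy = apply_bit_flip(word, random.randrange(3))
print(f"sent 1, received {noisy}, decoded {decode(noisy)}")
assert decode(noisy) == 1  # a single error is corrected
```

The quantum versions must protect against sign errors on the amplitudes as well, which is why measuring bit-flip and phase-flip errors simultaneously was notable.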
The various labs often disagree on the best way to build a quantum computer, and the art of programming qubits is as big a challenge as making the machine in the first place. Today’s software is based on algorithms that are sequential, one-step-at-a-time calculations. Top mathematicians aren’t yet sure how to write algorithms that calculate everything at the same time. It’s like trying to come up with a recipe for an apple pie in which all the ingredients combine in the pan in the same split second.
Yet hard as this work seems, scientists have become sure that the first quantum computers are within reach. Expect quantum computing news to keep coming. And it might not be too early to prepare for quantum-era life a couple decades from now. If you’re already worried that artificial intelligence will take your job, quantum AI will seem terrifying. Your Google self-driving car will be smarter than your whole department.