Special thanks to Scott Aaronson for answering 3 questions about his recently featured book – Quantum Computing since Democritus

I’m an Associate Professor of Electrical Engineering and Computer Science at MIT, affiliated with CSAIL. My research interests center around the capabilities and limits of quantum computers, and computational complexity theory more generally. – From Scott’s Homepage

Scott’s Homepage: http://www.scottaaronson.com

Scott’s Blog: http://www.scottaaronson.com/blog/

**#1 – What was the impetus for this book?**

The book evolved out of a course I taught at the University of Waterloo in 2006. The point of the course — and the book — was to explore the entire intellectual backdrop to quantum computing: to present it, not merely as a speculative technology, but as an idea that ties together fundamental questions in math, computer science, physics, and philosophy that have been around for decades or centuries, and that serves as a springboard for refining those questions or asking new ones. That’s really the “application” of quantum computing that I care about the most, and it’s one that we get to enjoy today, even before scalable quantum computers are a reality!

I also wrote the book because I wanted to serve what I see as a severely neglected audience: people who are neither experts nor math-phobes, and who want to learn about big, important, difficult ideas without the formalism of a textbook but also without the dumbing-down of a typical popular book. The sorts of readers I had in mind were precocious teenagers, scientists from other fields, or engineers who enjoyed their theoretical courses back in college and are curious to learn what’s new.

**#2 – What is the current fascination with quantum computing? How is it different from regular computing?**

This is a huge question, but briefly, a quantum computer is a computer that would exploit the rules of quantum physics — rules that are deeply unfamiliar to us from everyday life, but that have been known to govern the subatomic world since the 1920s. My one-sentence summary of quantum mechanics would be that it generalizes the laws of probability themselves, to allow numbers called “amplitudes” that can be either positive or negative (or in fact, even complex numbers), and that can therefore “interfere” and cancel each other out. The goal, with quantum computing, is to choreograph a pattern of interference where the amplitudes leading to each wrong answer cancel each other out, while the amplitudes leading to the right answer reinforce each other. With that sort of trick, it’s known how to solve certain specialized problems dramatically faster than the best known algorithms for existing computers. The most famous such problem is that of factoring large integers — something whose presumed hardness is the basis for most current cryptography! Another major application of quantum computers, and arguably an even more important one, would be simulating quantum physics and chemistry. However, a quantum computer would not be a general-purpose “ubercomputer”: there are many other types of problem where we think it would give us little or no speedup over existing computers.
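The interference Scott describes can be seen in a few lines of linear algebra. The sketch below (my own illustration, not anything from the book) applies the standard single-qubit Hadamard gate twice to a qubit starting in |0⟩: after one application the two amplitudes are equal, but after a second application the two paths leading to |1⟩ carry amplitudes of opposite sign and cancel, leaving the qubit back in |0⟩ with certainty.

```python
import numpy as np

# Amplitudes generalize probabilities: they can be negative (or complex),
# so different "paths" to the same outcome can cancel each other out.
# The Hadamard gate H is the simplest gate that exhibits this.
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # the qubit starts in state |0>

after_one = H @ ket0          # equal amplitudes for |0> and |1>
after_two = H @ after_one     # apply H a second time

print(after_one)              # ~[0.7071 0.7071]: 50/50 if measured now
print(after_two)              # ~[1. 0.]: the two paths to |1> had
                              # amplitudes +1/2 and -1/2, and cancelled
```

Measurement probabilities are the squared magnitudes of the amplitudes, so the negative sign in H is invisible after one application but decisive after two — which is exactly the resource quantum algorithms try to choreograph at scale.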

So, one reason to be fascinated by quantum computing is if you care about the applications: for example, if you’re a cryptographer worried about the security of electronic commerce, or a chemist or physicist who needs to simulate complicated quantum systems. But I, personally, got fascinated for a different reason: because quantum computing takes the question of the ultimate limits of computation, and combines it in such a startling way with the question of the nature and validity of quantum mechanics. Building a useful quantum computer would arguably be the most dramatic vindication of quantum mechanics itself that we’ve yet seen.

**#3 – Many of the claims about ‘science of the future’ often seem hyperbolic. Are claims about quantum computing more viable and realistic?**

On the one hand, the *theory* of quantum computing is based on extremely well-established physics: as I said, quantum mechanics has been with us since the 1920s, and has survived every experimental test for a century. Furthermore, at least on paper, we know a great deal about how a quantum computer could operate robustly, thanks to the so-called theory of quantum error-correction. The way I often put the point is this: if quantum computing turned out to be *fundamentally impossible* for some reason (that is, not just expensive or hard, but impossible, like perpetual-motion machines), then that would be far more surprising than if it were possible, since it would force us to revise our whole understanding of modern physics!
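The core idea behind the error-correction theory Scott mentions can be illustrated with its simplest ancestor, the repetition code. The toy sketch below (my own classical illustration, not the full quantum construction) encodes one logical bit as three physical bits and recovers it by majority vote, surviving any single bit flip; the 3-qubit bit-flip code applies the same redundancy idea to qubits, and genuine quantum codes extend it to protect phase information as well.

```python
from collections import Counter

def encode(bit):
    # one logical bit -> three physical bits (redundancy)
    return [bit, bit, bit]

def flip(codeword, position):
    # simulate noise: flip a single physical bit
    noisy = list(codeword)
    noisy[position] ^= 1
    return noisy

def decode(codeword):
    # majority vote recovers the logical bit despite any one flip
    return Counter(codeword).most_common(1)[0][0]

for bit in (0, 1):
    for pos in range(3):
        assert decode(flip(encode(bit), pos)) == bit
print("all single-bit errors corrected")
```

The threshold theorem of quantum fault tolerance shows that, as long as the physical error rate per operation is below some constant, layered versions of such codes let arbitrarily long quantum computations run reliably — which is why “possible on paper” is a well-grounded claim.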

But of course, just because something is possible in theory doesn’t mean it can be built in the near future: for example, it took more than a century to get from the idea of a stored-program classical computer to its practical realization, and likewise for heavier-than-air flight. Unfortunately, there’s enormous pressure from investors, funding agencies, and the like to promise a useful quantum computer within (say) the next 5 years, and part of our community succumbs to that pressure, even if they know at some level that it’s totally unrealistic. My fear is that, when the claims of a useful device right around the corner then turn out to be wildly overblown, the entire field will suffer as a result. A good rule of thumb is that, if someone claims they can make money from quantum computing right now, that person is selling snake oil. If, on the other hand, someone says they’re doing basic research that *might* lead to a scalable quantum computer in a hundred years, but that at any rate is interesting science — then not only is that person more honest, but ironically, whatever they’re doing is probably a better bet to have practical applications in 5 or 10 years.

[Image Credit: http://img.mit.edu/newsoffice/images/article_images/original/20120308160903-1.jpg ]

