Q – Initiative to build commercially available universal quantum computers (ibm.com)
212 points by davidyapdy on May 21, 2017 | hide | past | favorite | 63 comments


I tried testing the Bernstein–Vazirani algorithm on a 5-qubit quantum computer that IBM let us use for an afternoon for one of my university classes last semester. It was unable to recover the hidden string, even though simulating the circuit classically recovered it. Anyone can launch a "quantum computer as a service" platform, but the service won't be valuable if the computer doesn't work. And so far no quantum computer has solved a problem faster than we can classically.
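For comparison, the classical simulation really is tiny. Here's a minimal NumPy sketch of Bernstein–Vazirani (function and variable names are mine, not from IBM's SDK) that recovers the hidden string deterministically:

```python
import numpy as np

def bernstein_vazirani(s_bits):
    """Classically simulate Bernstein-Vazirani for hidden string s.
    Returns the measured bitstring, which should equal s exactly."""
    n = len(s_bits)
    dim = 2 ** n
    # |0...0> followed by H on every qubit: uniform superposition.
    state = np.full(dim, 1 / np.sqrt(dim))
    # Phase oracle: multiply basis state |x> by (-1)^(s.x).
    s_int = int("".join(map(str, s_bits)), 2)
    for x in range(dim):
        if bin(x & s_int).count("1") % 2:
            state[x] *= -1
    # Second layer of Hadamards, built as an n-fold Kronecker product.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    Hn = H
    for _ in range(n - 1):
        Hn = np.kron(Hn, H)
    state = Hn @ state
    # All amplitude now sits on |s>, so measurement is deterministic.
    outcome = int(np.argmax(np.abs(state)))
    return [int(b) for b in format(outcome, f"0{n}b")]

print(bernstein_vazirani([1, 0, 1, 1]))  # recovers the hidden string
```

On an ideal machine the final measurement yields s with probability 1, which is what makes the hardware's failure to recover it so telling.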


> but the service won't be valuable if the computer doesn't work.

So the state-of-the-art in quantum computing is still basically "can tell you the 200th prime number, sometimes"?

Not very useful.


But this is new territory, and researchers shouldn't give up just because it doesn't work very well now.

I recall that the iPhone/iPad were preceded by attempts at tablet computing that were very crude in comparison.

Give it a couple of years and see where it leads.


The current state of the art in QC is more like Alan Kay's Dynabook paper: https://en.wikipedia.org/wiki/Dynabook

i.e. even the "very crude attempts" stage (your Newtons or Palm Pilots) is still science fiction.


> I recall that the iPhone/iPad were preceded by attempts at tablet computing that were very crude in comparison.

That's a very poor analogy. Here we are talking about stuff that is not even functional, technology-wise.


A poor analogy, sure, but at least you get the picture.

They are talking about 16- and 17-qubit-capable processors today, but are looking at 50-qubit ones in a few years.


The key question is whether QC algorithms are scaling as predicted by QC theory, or at least better than classical algorithms for the same task. If the error correction is causing them to scale poorly, the design is probably infeasible for large-qubit calculations.

I expect it is scaling poorly, or IBM would have reported otherwise when moving from 5- to 16-qubit machines.


That sounds right, but to my ears it also sounds like an argument against commercial availability just yet.


Even if this is the case, at first people were scratching their heads looking at the transistor, wondering what it might be good for.


No, the transistor replaced the vacuum tube, which everyone was already using. When the transistor was invented everyone rushed to use it because it was clear what it was good for.


My point is that QCs have the potential to be as revolutionary as transistors were. Yet we are at the stage where transistors were mostly used to make radios portable. Thus it's not really justified to mock them as lame factoring devices.


That's not how I read your comment. It clearly tries to suggest that when the transistor was first invented people did not know what to do with it.

I think the MASER/LASER would have been a far better example.


Perhaps a Leyden jar, or an aeolipile. Novelties with great potential. Or perhaps they're just a drinking-bird kind of novelty - I don't think so, but maybe. QM is rock solid, but it doesn't tie together well with our other theories about the world. Maybe knowing more will make the error correction easier somehow.

I kinda think we're in the difference engine phase of quantum computing. Same problems as Babbage building a neat thing, but we're missing some details and our tools aren't really sharp enough to make a good one.


I read somewhere (can't recall the source) that it was a nice gadget, but it wasn't an overnight sensation. You are right, the MASER/LASER is a better analogy.


If you are talking about transistors, when they were invented the only other alternatives were triodes (vacuum tubes) or mechanical relays. Both were orders of magnitude less reliable and slower in terms of switching speed. When transistors were invented it was a mad dash to get them cheap and small enough to replace everything else, because everyone could tell they were the future. It took 9 years between the first transistor and receiving the Nobel Prize for it.


I don't know where you read that but that is not the story that I'm familiar with.

Applications for the transistor were known before the device even existed.

The problems were first to make them reliably, then to make them in quantity and finally to make them cheap enough. All those were overcome and the revolution that followed has not stopped even today.


No, the first customer for the transistor was the government, because they were the only ones who could afford it.


Yes, and the reason the government rushed to it is because they wanted the absolute best technology (that was orders of magnitude better than anything else) and would pay for it to have a leg up over their enemies. If they were cheap right when they came out, everyone would have bought them.


The point where it gets interesting for realistic physics and chemistry applications is around 100 (error-corrected) logical qubits and 10^8 coherent operations, see for example https://arxiv.org/abs/1510.03859.

The error correction adds another factor of at least 100 or so in both qubits and gates needed (but possibly much bigger than 100, depending on qubit quality), see for example https://arxiv.org/abs/1312.2316.

Other fields of application - factoring large integers, for example - takes many many more qubits to be interesting.

While it's good to get people excited about the potential of quantum computing, it seems a bit disingenuous to suggest that a 17-qubit quantum processor is commercially interesting. I especially like how they juxtapose it with the publicly available 16-qubit quantum processor to make it seem like one extra qubit makes it worth paying money...
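The scale gap is easy to make concrete. A back-of-envelope sketch using the two rough factors cited above (both are order-of-magnitude estimates, not exact figures):

```python
# Physical qubits needed for useful chemistry, per the rough
# estimates above: ~100 logical qubits, and an error-correction
# overhead of at least ~100 physical qubits per logical qubit.
logical_qubits = 100
ec_overhead = 100
physical_qubits = logical_qubits * ec_overhead
print(physical_qubits)  # 10000 -- versus the ~17 qubits on offer today
```

And that overhead factor is the optimistic end; depending on qubit quality it could be much larger.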


I dunno. How many qubits would one need to improve DFT chemical calculations?

You don't need too much precision to determine if a chemical reaction will happen or not. You do need many operations, but not many bits.


Don't forget cryptography.


That's covered by "factoring large integers", I do believe.


There is more to crypto than that, right? The discrete logarithm problem is independent of integer factorization.


1. Do they claim that the backend is actually a quantum computer? Or are they just providing a quantum computing like interface, which is backed by a classical computer emulating a quantum computer?

2. Looking at their Terms of Service (excerpt below), it's unclear whether you share rights to any model that you try in their playground.

<i>"IBM does not want to receive confidential or proprietary information from you through our Web site. Please note that any information or material sent to IBM will be deemed NOT to be confidential. By sending IBM any information or material, you grant IBM an unrestricted, irrevocable license to copy, reproduce, publish, upload, post, transmit, distribute, publicly display, perform, modify, create derivative works from, and otherwise freely use, those materials or information. You also agree that IBM is free to use any ideas, concepts, know-how, or techniques that you send us for any purpose. However, we will not release your name or otherwise publicize the fact that you submitted materials or other information to us unless: [...]"</i>


How is it unclear? It seems pretty explicit from the passage you quote that they can take anything you upload and do whatever they want with it.


I know next to nothing about quantum computing, but something about this seems extremely fishy. Had they developed a working quantum processor, would they not have publicized its capabilities far and wide before releasing an enterprise API? Failing at that, would there not be a benchmarks page that demonstrates the capabilities of this system? Could someone better acquainted with this technology weigh in?


It's mostly marketing at this point.

Even if their quantum computers work, at the scale they operate today everything they do will also be doable by simulating a quantum computer on an average PC with some slowdown. It's just "now we can do it on a real quantum computer".

They would have to rapidly increase the bit number to produce anything that's useful. That may very well happen in the future, but nobody knows for sure.


Indeed, someone spends what AFAICT is a very small amount of his time debunking every last claim that any given quantum computer does anything straightforward simulated annealing can't do.


Not any given quantum computer -- Scott Aaronson debunks the D-Wave "quantum" annealers that aren't actually shown to be harnessing the quantum power.

IBM's machine by contrast is a genuine quantum computer -- the kind Scott Aaronson would probably have no problem with. But the number of qubits is too tiny to do anything interesting.


Scott Aaronson?


Sounds right.


> While technologies like AI can find patterns buried in vast amounts of existing data, quantum computers will deliver solutions to important problems where patterns cannot be found and the number of possibilities that you need to explore to get to the answer are too enormous ever to be processed by classical computers.

Oh what a load.


There are some pretty cool problems in quantum systems, like chemistry, that even a classical supercomputer would not be able to solve. This could have a very positive impact on fields like medicine.


I am not saying what quantum computing is or isn't doing. Contrasting quantum computing and AI is like comparing... digging a hole and shovels. "While digging a hole is useful, these shovels are even better than hole-digging!" It's nonsense sauce. If quantum computing can be useful at AI tasks, then quantum computing won't replace AI, we'll be doing AI on quantum computers. This is just marketing mumbo jumbo to convince some C-suite type that doesn't know a thing about data science to say, "Why are we still using AI and not this quantum thing IBM has? Quantum is better than AI!"


It's a little funny that you're defending the integrity of calling the status quo "AI", don't you think?


Yes, yes, real AI died in the AI Winter because the settlers didn't have enough parentheses to last until spring, it was all very tragic, and now we're all just stirring the pile of linear algebra until the results look good. [1] But that doesn't make IBM's marketing copy any better. And IBM loves making grandiose claims that don't work out in practice. [2]

[1] https://xkcd.com/1838/ [2] https://www.healthnewsreview.org/2017/02/md-anderson-cancer-...


Their product marketing surely is in a quantum state.


What interesting computation can you do on a 16 qubit processor?


you could factor an 8 bit number really fast heh


I think it's more like a 6 bit number, and probably not very fast at all...
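A rough sanity check of that 6-bit figure, assuming the well-known 2n+3-qubit variant of Shor's circuit (Beauregard's construction; this ignores error correction, which would make the picture far worse):

```python
def max_factorable_bits(total_qubits):
    """Largest n-bit integer Shor's algorithm could target on a machine
    with total_qubits, assuming a 2n+3-qubit circuit construction."""
    return (total_qubits - 3) // 2

print(max_factorable_bits(16))  # 6
```

So with 16 qubits and this construction, a 6-bit number (i.e. at most 63) is about the ceiling, and breaking 2048-bit RSA would need thousands of logical qubits.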


That's been available for months. Has anyone used it?


A five-qubit processor has been available for a bit over a year now [1]. Last week they announced that they'll make a 16-qubit processor freely available, but for now it is invite-only. They also announced that they'll sell access to a 17-qubit device.

[1] https://arstechnica.com/science/2016/05/how-ibms-new-five-qu...


I was listening to a podcast yesterday where Google is basically "guaranteeing a breakthrough" in QC by the end of the year.

Something about the 49-qubit threshold or some such proclamation, so maybe that's something that will finally move the needle.


The Google group in Santa Barbara is building to a 49-qubit computer. If I understand correctly, it'll be interesting for understanding how you calibrate your quantum machine and validate that it is doing what it is supposed to. The algorithms they run on classical computers to validate that device will need a decent amount of supercomputer time. And much bigger devices would be impossible to validate with a classical computer in the same way, so in some sense you hit "the limit" of classical computation and start to move beyond.

But for the computational problems that are useful for applications, which are not very much like the problem they use for validation, 49 qubits is still far far away from beating a classical computer.


Was it this Podcast? http://www.bbc.co.uk/programmes/p052800h

"IBM is giving users worldwide the chance to use a quantum computer; Google is promising "quantum supremacy" by the end of the year; Microsoft's Station Q is working on the hardware and operating system for a machine that will outpace any conventional computer. Roland Pease meets some of the experts, and explores the technology behind the next information revolution."


Yep...that was the one...further on in the broadcast the researcher from Google explains that for "Quantum Supremacy", that "49-qubits" are enough to "leave the fastest classical computer in the dust" or some such hype.


The explanation videos are very fluffy. Anyone find any technical overview articles?


> All of this sophisticated engineering makes [the 17 qubit processor] at least twice as powerful as the [16 qubit processor].

Isn't an n+1 qubit processor always twice as fast as an n qubit processor?


Looking at the SDK, the programming language of the future is Python.


16 qubits? How hard is it to simulate those 16 qubits with a regular computer? I get this is marketing but people would be better off just running a simulator at this point.


With 16 qubits, the Hilbert space has 2^16 = 65536 elements. For numerical simulation, you need to store the wavefunction amplitude for each of these elements, usually using a double-precision complex number (16 bytes). Altogether the wavefunction can be stored as a vector with 1 MiB of data. AFAIK all quantum computing algorithms can be expressed as a sequence of 1- and 2-qubit operations, which would be implemented as sparse matrix-vector products. So the simulation part is almost trivial. ;) I'd say that for 16 qubits, designing the quantum circuit to perform an actually useful operation is the far more difficult part.

However, once you start adding more qubits the simulation part becomes hard to infeasible. I think I saw some paper which simulated around 40 qubits. Here the wavefunction already needs 16 TiB, so you need a big HPC cluster to run that.
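The "sequence of 1- and 2-qubit operations" point can be sketched in a few lines of NumPy: applying a single-qubit gate never requires the full 2^16 x 2^16 matrix, only a reshape and a small contraction (helper name is mine, not from any particular SDK):

```python
import numpy as np

N = 16
state = np.zeros(2 ** N, dtype=np.complex128)
state[0] = 1.0  # start in |00...0>

H = np.array([[1, 1], [1, -1]], dtype=np.complex128) / np.sqrt(2)

def apply_1q_gate(state, gate, k, n):
    """Apply a 2x2 gate to qubit k (0 = most significant) of an
    n-qubit statevector, without building the 2^n x 2^n matrix."""
    psi = state.reshape((2 ** k, 2, 2 ** (n - k - 1)))
    psi = np.einsum("ab,ibj->iaj", gate, psi)  # contract the k-th axis
    return psi.reshape(-1)

state = apply_1q_gate(state, H, 0, N)
# Amplitude 1/sqrt(2) now sits on |00...0> and |10...0>.
```

Each such gate costs O(2^n) work, so at 16 qubits a whole circuit is milliseconds on a laptop.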


It's pretty easy. If your computer has enough RAM to store a length-2^16 complex vector (which is 2^20 bytes, or 1 MB), then you can open up an IPython notebook and write code to apply quantum gates to it with no problem.

The problems start to set in if your RAM can't hold the wavefunction in memory (so around 28 qubits, which takes 2^32 bytes = 4GB of RAM.)

With specialized code and supercomputers you can get a little farther, but you will be fighting exponential growth, so not too much. The practical limit for classical computers is in the 40-50 qubit range.
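The memory figures in this comment and the one above, computed in one place (16 bytes per double-precision complex amplitude):

```python
def statevector_bytes(n_qubits):
    """Memory to hold a full statevector of complex128 amplitudes."""
    return (2 ** n_qubits) * 16  # 16 bytes per complex128

for n in (16, 28, 40, 49):
    print(n, statevector_bytes(n) / 2 ** 30, "GiB")
# 16 qubits ~ 1 MiB, 28 ~ 4 GiB, 40 ~ 16 TiB, 49 ~ 8 PiB
```

The doubling per qubit is why ~49 qubits is where even supercomputer simulation gives out.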


> If your computer has enough ram to store a size 2^16 length complex vector (which is 2^20 bytes, or 1MB)

One megabyte was enough even for quantum computing! Wow. Bill Gates, what a visionary ;-)


2^16 hard ...


IBM's MO seems to be jumping onto new technologies with name recognition and making BOLD claims they can't back up. See Watson. See Q. See IBM Blockchain.

This only serves to build distrust with developers. I bet their investors love it, though, but only because they don't understand it.


It seems that IBM is really struggling to stay connected to reality, let alone run a profitable business.


Are you suggesting IBM is not profitable?


They seem to be running on inertia at this point. I haven't seen any plausible new enterprises from them. It's possible I'm looking in the wrong places, though.


Fully agreed, they've been running on inertia for the last two decades, but that has nothing to do with profitability. They're still printing money every quarter and paying out substantial dividends.

Obviously that can't go on forever and they're desperately searching for a path to a viable future but for the moment they are definitely in the black and will - as far as I can see - stay there for quite a while to come.


This is horribly reminiscent of the marketing programme for IBM Blockchain.


Amazing this has not been renamed to Watson.


I'm glad it isn't, it really waters down both brands...


Wow, that's right.



