The system is clearly not "live and in use" without its dilution fridge and thermal radiation shields.
rezmason 1 day ago [-]
Just as long as we don't observe it reeeeally closely, I imagine.
teleforce 24 hours ago [-]
>The computer is (said to be) live and in use by companies, so cryogenic cooling keeps the system temperature as close to absolute zero as possible to conserve that precious quantum state.
Before LLMs/AI became the obvious "next big thing in computing" I remember coming across a fair number of opportunistic devs on LinkedIn trying to promote themselves as "quantum software engineers", and even just a year or two ago I would see "quantum machine learning" on people's profiles. I remember thinking maybe I had missed something and checking how many qubits we could even have... needless to say it was (and still is) not enough for any meaningful ML work quite yet.
If you search you can still find some, and, as someone who has spent more than a decade doing actual machine learning, I find the claim that you're doing any kind of serious software engineering, let alone proper ML work, on a quantum computer to be almost impressively audacious.
amitav1 22 hours ago [-]
As someone currently dipping my toes into quantum machine learning, I think you're assuming we mean "machine learning on quantum hardware", when what we actually mean is "machine learning for quantum computing on classical hardware". That is, using machine learning on classical computers to try to increase the effectiveness of quantum hardware.
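To make that concrete, here's a toy sketch of one such classical-side job (plain NumPy, all error rates made up): learning a readout-error model from calibration shots and using it to correct measured counts. Real work uses much richer models, but the shape is the same.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical readout error rates of a single qubit (unknown to the "user").
    p01, p10 = 0.05, 0.08                        # |0> read as 1, |1> read as 0
    true_confusion = np.array([[1 - p01, p10],
                               [p01, 1 - p10]])  # columns = prepared state

    def measure(prepared, shots):
        # Simulate noisy readout: counts of [0, 1] outcomes.
        ones = rng.binomial(shots, true_confusion[1, prepared])
        return np.array([shots - ones, ones])

    # "Training": estimate the confusion matrix from calibration runs
    # that prepare |0> and |1> a few thousand times each.
    shots = 4000
    est = np.column_stack([measure(0, shots) / shots,
                           measure(1, shots) / shots])

    # "Inference": undo the learned error on a fresh experiment by applying
    # the pseudo-inverse of the estimated confusion matrix.
    raw = measure(1, shots)                      # circuit that should yield |1>
    mitigated = np.linalg.pinv(est) @ (raw / shots)
    mitigated = np.clip(mitigated, 0, None)
    mitigated /= mitigated.sum()

    print("raw       P(1):", raw[1] / shots)     # ~0.92
    print("mitigated P(1):", mitigated[1])       # ~1.0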
volemo 20 hours ago [-]
I know a couple of quantum software engineers and these people are in universities writing novel algorithms on whiteboards (and sometimes testing them out in QuPy).
georgeecollins 19 hours ago [-]
As I understand it, quantum algorithms are very much needed as the hardware improves. The hardware gets shown off, but there aren't a lot of algorithms that could take advantage of it even if it worked better. Yet.
volemo 17 hours ago [-]
As I understand it, the hardware very much lags behind the algorithms. We have plenty of cool algorithms to run on quantum computers, but to be practically interesting they all need more qubits or better coherence times than is available today.
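To make that gap concrete, here's a rough NumPy sketch of Grover search by brute-force statevector simulation (a toy, not how you'd really do it). The algorithm itself is a dozen lines; what gives out is memory, which doubles with every extra qubit, and that's exactly the part only better hardware can fix.

    import numpy as np

    def grover(n_qubits, marked):
        # Brute-force statevector simulation of Grover search.
        dim = 2 ** n_qubits                          # memory doubles per qubit
        state = np.full(dim, 1 / np.sqrt(dim))       # uniform superposition

        for _ in range(int(np.pi / 4 * np.sqrt(dim))):
            state[marked] *= -1                      # oracle: flip marked amplitude
            state = 2 * state.mean() - state         # diffusion: invert about the mean

        return int(np.argmax(state ** 2))            # most likely measurement outcome

    print(grover(n_qubits=10, marked=777))           # 777, found among 1024 items
    # The same code at 50 qubits would need a state vector of 2**50 amplitudes,
    # i.e. petabytes of memory -- hence wanting real hardware with enough good qubits.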
potato3732842 15 hours ago [-]
Probably just marketing wank, but I got a chuckle out of "it’s not likely to be something you’ll ever have at home" as if we haven't all heard that before.
TriangleEdge 21 hours ago [-]
This seems like a PR stunt to me. Now I am wondering if there was similar news about transistors.
wcallahan 17 hours ago [-]
I suspect I’m not alone in pausing at the statement:
> "It’s not likely to be something you’ll ever have at home"
I’m curious… what would need to be true to make this statement wrong?
fancyfredbot 13 hours ago [-]
We'd need to know how to build a useful quantum computer, find a useful algorithm to run on it (factoring products of large primes lacks broad consumer appeal), and use this demand to fund research into ways to reduce manufacturing and running costs to reasonable levels.
Alternatively, we all move into science labs.
pjs_ 1 day ago [-]
Dil fridge will get a bit hot with its clothes off like that
MengerSponge 21 hours ago [-]
Either IBM has cracked optically transparent coatings that cycle from 300 K to 1 K repeatedly and acrylic that has a metal's thermal conductivity, or it's a sham.
Really hard to tell which it could be.
volemo 20 hours ago [-]
> optically transparent coatings that cycle from 300 to 1K repeatedly and acrylic that has a metal's thermal conductivity
I believe that still wouldn’t work, because the optical part of the spectrum carries thermal energy too.
MengerSponge 7 hours ago [-]
Oh hush, you're not going to nerd snipe me into doing the thermal flux calculations today.
Optical photons don't carry an impossible amount of energy: I've seen liquid helium through a small coated window. The window was there for ion beam purposes, not "entertaining the grad student", and it was a big element in the heat budget!
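If anyone else wants to be nerd sniped, the back-of-the-envelope version is just the Stefan-Boltzmann law over the window area. The numbers below are made up for illustration, not IBM's actual geometry.

    import math

    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

    def radiative_load(t_hot, t_cold, diameter, emissivity=1.0):
        # Radiative heat flow through a circular window of the given diameter (m).
        area = math.pi * (diameter / 2) ** 2
        return emissivity * SIGMA * area * (t_hot ** 4 - t_cold ** 4)

    # A 25 mm viewport at 300 K staring at a 4 K stage (worst case, emissivity 1):
    print(round(radiative_load(300, 4, 0.025), 2), "W")   # ~0.23 W

    # For scale, the coldest stage of a dilution fridge only has on the order of
    # microwatts of cooling power, so a bare window onto it is a non-starter.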
This is consistent with IBM's history of putting computers doing customers' work on display. I am aware of the company doing so in New York and Toronto.
andrewxdiamond 1 day ago [-]
They also have one displayed at the Cleveland Clinic main campus _cafeteria_
Imo focusing on “showing off” instead of “providing value” is a bit of a product smell. Maybe that’s just the point, though: IBM seems to prioritize impressing C-suites over actually accomplishing anything.
mikeyouse 1 day ago [-]
It’s not unheard of in the medical realm. Slightly different, but when Intuitive Surgical released their da Vinci robotic surgery platforms, a hospital system I worked with was early on their list. They also set up the demo unit in the cafeteria so you could see surgeons peeling oranges and then stitching them back up, or whatnot.
sanswork 1 day ago [-]
Back in the early 2000s I worked for Cap Gemini in Birmingham, England, in an office that had a section that was some sort of partnership with IBM GS (I think IBM did the hardware and Cap got the services contracts). They also had a big blinking-lights server set up in the middle of the office for clients to see. As a teenage geek in his first tech job I used to love going to peek at it, even though I did tape rotation on the real servers in the basement most days.
RealInverse42 12 hours ago [-]
And can it observe the observers?
t1234s 21 hours ago [-]
Reminds me of computers in ’80s/early-’90s sci-fi movies.
carabiner 1 day ago [-]
This could cause a resonance cascade.
MaintenanceMode 21 hours ago [-]
Or can you!?
seeknotfind 21 hours ago [-]
"It’s not likely to be something you’ll ever have at home" Pessimistic much?
But can it factor 21? [1]
[1] Why haven't quantum computers factored 21 yet?
https://news.ycombinator.com/item?id=45082587
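For reference, everything around the quantum step is classical and trivial. This sketch does Shor's classical post-processing for n = 21, with the period found by brute force; that brute-force loop is exactly the part the quantum order-finding subroutine is supposed to replace.

    from math import gcd

    def factor_via_order_finding(n, a=2):
        # Shor's classical post-processing. The brute-force period search below
        # stands in for the quantum order-finding subroutine.
        g = gcd(a, n)
        if g != 1:
            return g, n // g

        r, value = 1, a % n                  # find the period r of a**x mod n
        while value != 1:
            value = (value * a) % n
            r += 1

        if r % 2:                            # odd period: retry with another a
            return None
        y = pow(a, r // 2, n)
        if y == n - 1:                       # trivial square root: retry
            return None
        return gcd(y - 1, n), gcd(y + 1, n)

    print(factor_via_order_finding(21))      # (7, 3) with a = 2, period r = 6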
There's even a Python package called Qiskit.
https://quantum.cloud.ibm.com/
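If you want to poke at it, the hello-world is short. The sketch below assumes "pip install qiskit" and only builds and draws a circuit locally; running it on real IBM hardware also needs an account on the cloud service linked above.

    from qiskit import QuantumCircuit

    # Two-qubit Bell-state circuit: Hadamard, CNOT, then measure both qubits.
    qc = QuantumCircuit(2, 2)
    qc.h(0)
    qc.cx(0, 1)
    qc.measure([0, 1], [0, 1])

    print(qc.draw())   # ASCII drawing; to actually run it, hand the circuit to a
                       # local simulator or an IBM backend of your choice.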