Valuation over value.
It's insurance money. If you're a manager of a big company like IBM, Microsoft or Google, you have to align your current product portfolio and future portfolio in a way that shows your investors that your company will keep growing, even if your current products are stagnant.
You can surely say quantum computing won't do much in the next 5 years. But what about 10 years? 20 years? 30 years? The farther you look into the future, the bigger the probability of a huge tech breakthrough that could give the company that has it a massive edge on the market.
Even if you have a chance of 1% of having a sort of transistor revolution from QC, it becomes a race to the bottom. If Google starts researching it, IBM will follow suit, and so will Microsoft. If in 30 years this turns out to be a big deal, no one will be 30 years behind.
Picking a company to invest in is only half the job; choosing how much to invest is the other half.
It's a terrible way to invest if you put all your money into it. QC "changing the world" is a tail-end event. You allocate according to the risk.
Almost as if I were reading about cryptocoins.
But the big research plays (Bell Labs, Xerox PARC) seem to get less and less funding, if they exist at all. A lot of the inventions from those places were monetized outside those companies. IBM had a chip fab in the research building… that business was long since spun off.
At the turn of the century IBM was researching quantum computing, but as I was leaving, selling services was IBM's big push.
Forty years ago they were the best game in town for applied research (defining applied as: on success, having a fast track to commercialization). Later it became similar to university research: on success, you write some articles and get some kudos within the company, but the business people have no idea where to stick it and half-heartedly throw a few applications at it to see if any stick. Most don't (e.g., Deep Blue, Watson).
At this point large-company R&D centers got passed over (by a lot) by VC-funded applied research and saw a (IMO well-deserved) drop in funding.
Okay, integer linear programming (ILP) problems... To get all excited about quantum computing (QC), you need to get excited about the big money to be saved by solving all those important, practical ILP problems.
Okay, I had a good background in pure/applied math and in computing and got into ILP for scheduling the fleet at FedEx. Since the promised stock was 1+ years late, I ran off and got a Ph.D., in one of the best programs, in hopefully useful pure/applied math, and much of that work was in ILP.
Here is some blunt truth about the NP-complete problems and the cartoon at the beginning of the famous book by Garey and Johnson: the math guys are explaining to their manager that they couldn't solve the manager's problem, but neither could a long line of other math guys.
Here the blunt part is the meaning of "solve" -- with a computer program running in time only polynomial in the size of the problem, get an optimal solution to any instance of the problem, including the worst cases. And here optimal means down to the last penny to be saved. So, for some network deployment by AT&T that was to cost $1 billion, save down to the last penny, in polynomial time, including for the worst-case instance of the problem.
Yup, maybe the savings would be $51,937,228.21. And you do want to save that last penny. But if the manager would settle for saving just the first $51,900,000.00 in reasonable computer time for all or nearly all the actual instances of the manager's real problem, then there would be little or no difficulty. And you should be able to tell the manager that savings of more than $55 million, or some such, were impossible -- that is, have an upper bound.
So, much of the difficulty was saving the last $37,228.21, guaranteeing to do so, for all instances of the problem, including the worst cases.
Well, I can assure readers that had I insisted on a career of saving, e.g., $51,900,000.00 where savings of $55 million were impossible, then I would have spent the last several decades homeless on the streets, or dead from homelessness on the streets -- no joke.
Bluntly, there just is no significant demand for solving ILP problems in practice. The "managers" don't want to get involved.
Selling pizzas from the back of a truck? Sure -- might sell 100 pizzas a day. Selling solutions to ILP and other NP-complete problems -- f'get about it.
Uh, since there is no significant demand for saving $51,900,000.00 with a bound of $55 million, there stands to be no significantly greater demand for saving $51,937,228.21.
Thus, there stands to be no significant value in QC for solving NP-complete ILP problems. Sorry 'bout that. If some people wanted to get the $51,900,000.00 savings, they've been able to do that for decades and have voted loud and clear: "We don't care."
E.g., in one of my attempts, a guy sent me an ILP problem, we talked, and two weeks later I had running code that in 900 seconds on a slow computer got a feasible solution guaranteed to be within 0.025% of optimality. The problem had 600,000 variables and 40,000 constraints. I had done the work for free. Still, then, suddenly he was not interested.
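The guarantee described above ("within 0.025% of optimality") is the standard relative optimality gap a branch-and-bound ILP solver reports. A minimal sketch in Python, with made-up numbers (not the commenter's actual data):

```python
# Hypothetical numbers for illustration only. An ILP solver (branch and
# bound, say) tracks two values as it runs:
#   incumbent - savings of the best feasible solution found so far
#   bound     - a proven upper bound on any achievable savings
# The relative gap certifies how far the incumbent can possibly be from
# optimal, without ever finding the true optimum.

def relative_gap(incumbent: float, bound: float) -> float:
    """Proven optimality gap, as a fraction of the upper bound."""
    return (bound - incumbent) / bound

incumbent = 51_900_000.00   # best feasible savings found
bound = 51_912_900.00       # proven upper bound (made up for illustration)
print(f"guaranteed within {relative_gap(incumbent, bound):.4%} of optimal")
```

In practice solvers stop as soon as this gap drops below a tolerance, which is exactly the "settle for the first $51,900,000.00" trade the comment describes.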
So be it.
There was another one: I was writing the code using the idea of a strongly feasible basis, and suddenly the customer was not interested and returned to some not very good heuristic code he had.
Better, a lot better, to sell something a lot of people actually want, e.g., a lot better to sell pizza.
And I am doing a startup that to me continues to look good, software running, but it has nothing to do with NP-complete or ILP and wouldn't be helped by QC.
So, to me, e.g., even if Google gets a good QC that can solve ILP problems, then I don't believe that they will have many customers or much of a business and there will be no big reason for IBM or Microsoft to worry.
Since there is no significant demand for using ILP to save money now, I don't see a significant demand for using QC on ILP to save money in the future.
Their employees might be better off selling pizzas. Let's see: From some of my arithmetic about costs of pizza, can do well for $2-3 a pizza. From a pizza truck in a good location might be able to sell the pizzas for an average of $10 each, e.g., an extra $1 for anchovies! Might sell 100 pizzas a day for $1000 a day, maybe 20 days a month. Looks like a better career than QC research!
If there is no demand for pizzas, then there won't be much demand for pizzas with anchovies.
Uh, the Google QC researchers are well paid? Terrific -- park the pizza truck near the Google QC research building!!!!
For some parts of US national security, the situation for a good QC might be significantly different -- I doubt it, but maybe.
A negative (or, depending on your outlook, positive) take is that it's a disinformation campaign, run so one can maintain the lead along a particular trajectory of technical dominance. As an extra game-theoretic precaution, which also amplifies the disinformation campaign, one funds outside research in the same direction, as both a distraction and an 'impossibility canary.'
Quite... deliciously deceptive.
"The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money."
I don't see any argument that the technology is fundamentally unsound or doesn't scale, even though that's an argument I'm pretty amenable to.
2. It is worth mentioning that by the standards of your comment, the time between conceiving of a classical computer (Babbage) and a scalable electronic computer (ENIAC and family) was about a century.
3. While ultimately there might be a "quantum winter" in the next few years because we (I work in the field) overpromised, this would not be the first time a tech that ultimately works gets disregarded for a decade or two because of mismanaging expectations (e.g. Liquid Crystal displays or neural networks, which were both developed for many decades before being commercially viable).
EDIT: And yes, there are some startups with misleadingly general pitches.
If you're thinking that the whole purpose of QC will be quickly subsumed by wide algorithms with superpolynomial speedup, you might be missing the point. It's about how computers are built, not about stuffing one specific abstraction into another. If suddenly we discover we can build a machine that can generate random numbers a quadrillion times faster than any current hardware design, that's a new space in computation.
I mean, consider how widely deployed the parallelism construct is now, and that Amdahl's law was elucidated in the 1960s.
Parallelism was just one degree of freedom for us to climb the S-curve on, quantum computing seems to provide essentially a continuum of them.
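Amdahl's law mentioned above has a one-line form: with parallelizable fraction p of a program and n processors, the overall speedup is 1/((1-p) + p/n). A quick sketch:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Amdahl's law: overall speedup for a program whose fraction p
    (0 <= p <= 1) is parallelizable, run on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# The serial fraction caps the gain: a 95%-parallel program
# tops out at 1/(1-p) = 20x no matter how many processors you add.
print(amdahl_speedup(0.95, 10))        # ~6.9x
print(amdahl_speedup(0.95, 1_000_000)) # approaches the 20x ceiling
```

The S-curve framing above is this ceiling: each new degree of freedom (more cores, then perhaps qubits) resets which term dominates.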
Why do people keep repeating that? You mean that if somebody creates a machine that can simulate chemistry and materials science in polynomial time, nobody would use it? That's crazy.
We are building user interfaces that make it easier to “play around” with quantum computing phenomena—especially with music and art—with the idea that our aesthetic sensibilities may help drive discovery.
How is it even remotely close?
Vacuum tubes were a thriving industry, producing many groundbreaking products and services.
Vacuum tubes are analogous to the computers we have now. In 1925 a patent for the concept of a FET was filed; it wasn't until 1948 that we had a working transistor.
That took 23 years to go from concept to useful invention. It isn't too surprising that a quantum computer is harder.
Not that quantum computing is at the same stage the vacuum tube industry was at some unspecified time.
So far, quantum computing is used in labs, not for any real useful purpose.
https://quantumdelta.nl/ is some kind of hub but landing pages offer too much hype and too little content :)
thank you.
"Separating Quantum Hype From Quantum Reality" - https://news.ycombinator.com/item?id=32691220
An article with a little more depth might examine the future of trapped-ion quantum computing, for example:
https://en.wikipedia.org/wiki/Trapped_ion_quantum_computer
As far as the 'make money off new drugs' mentality goes, that's not really where QM chemical simulation in molecular dynamics seems all that promising; it's more about things like the design of new catalysts to improve the efficiency of various industrial processes.
If QM computation is eventually developed, the devices will almost certainly be large and extremely expensive (kind of like the cutting-edge chip fab machines of today in scale). For most businesses, it's unlikely the benefit of owning one will justify the cost, so it'll probably be a national lab / research center type thing.
The key mechanism to protect inventions is patents. Patents have a limited shelf life. If you file a lot of patents today and it takes 30 years before you can apply them, they will have expired by then and others are free to take your inventions and build on that. So, if quantum computing requires another three decades to start making money, most of the companies that are currently being invested in will have failed and their patent portfolios and investment will be worthless. Their patents will have expired, their founding scientists will have moved on or retired, etc. At best those companies may be in a position to file more patents. So, any investors investing right now are making bets on how long it will take before there's a meaningful market to get an ROI and which companies are positioned best to take a chunk out of that market. The further that is out, the higher the risk of losing their investment.
There are billions flowing into quantum computing, and the article is simply making the point that in terms of revenue potential there is a lot of uncertainty about the practicality of current approaches, a lack of any real revenue (beyond consulting people on how awesome it would be if we had working quantum computers), and a lack of perspective on when all this will change. Very valid points. There are a few big companies investing in this stuff, but none of them is betting the company on it. It's a side show at MS, Google, IBM, etc.
A long shot that might create some viable business decades from now, but if it all fails, their stocks will be fine. There's enough substance there for them to want a finger in the pie if it does take off, but none of these companies seems to be counting on that happening any time soon.
The physics underpinning QC was arguably proven in the 2020s. It's not quite done (in the way that fusion was not quite done in the '50s), but there is a fairly clear set of demonstrations that QCs with error correction are possible. However, the engineering barriers are fierce and there is still a possibility that they are insurmountable. In addition, there are concerns that, even if QC works, the class of problems that is in NP and also in BQP may be very small. Even if a problem is in that group, it may be that the algorithms we have are not superquadratic or quadratic, meaning the improvement they offer over classical algorithms may be marginal.
Worse, there are often very good heuristic approaches to some of these problems, which means that although a superquadratic QC approach would be an amazing breakthrough of computer science (genuinely amazing, worthy of accolades and prizes, and fundamentally important for our understanding of the universe, etc.), it would offer only marginal economic value (possibly). Now, this is not true of some problems where there are exponential explosions and no good heuristics... but there is an even worse catch, which is that quantum algorithms offer computer scientists fresh insight into what's tripping up the classical approaches. In this scenario, an amazing breakthrough happens in QC, and someone uses it to get an insight that pushes the classical approach close enough to the QC approach to render the QC approach marginal.
The theoretical picture is moving very fast though - so we will have to see.
On the other hand, the practical side is moving more slowly. We see announcements that make one think that a Moore's-law type of scaling is happening, but hidden in the small print there are often (always, as far as I can decode) catches that mean that while the results look great they are still very much mired in problems. For example: are all the bits on a QC usable at once? Can they be used to form an actual algorithm? How long does the machine run for? How long does it take to start? Some of the answers are jarring: often only a small subset of a machine can be used in an actual problem-solving episode; sometimes the machines run for a few steps only; sometimes the machines take 24 hours or longer to start.
It has taken 70 years to nearly build fusion reactors, it took 70 years to create mRNA vaccines. It may well take 70 years (from now) to build practical, valuable quantum computers. And something could go wrong on that path that just renders them moot.
Furthermore, the paper itself links to a github repository[2] with a list of papers that either imply or use an exponential advantage in quantum chemistry. Now would be a good time to mention that I am not an expert in chemistry, nor have I read the entirety of this list of papers, so I am not in a position to go through each and every one to decide how generic their results are or what the limitations are. Perhaps all these papers have fundamental limitations that prevent them from being useful in normal chemistry, only in weird souped-up problems specifically devised for a quantum advantage.
Either way, this paper is by no means conclusive on the subject. There's a ton of more research to be done in multiple fields to know for sure.
[1] https://arxiv.org/pdf/2208.02199.pdf [2] https://github.com/seunghoonlee89/Refs_EQA_GSQC
I can provision 1k CPU based servers or ~20 4x GPU based servers in a cloud computing environment for an hour for <$400. These are mature technologies with massive economies of scale behind them. A quantum computer needs to not only outperform scale out GPU/CPU performance on a particular problem set, it needs to crush it.
Hmmm. I'm no mathematician, but I thought the value of an "exponential speedup" is if you are trying to solve a problem with "exponential complexity".
I don't know if "exponential complexity" is a thing; I'm pretty sure "exponential speedup" isn't. Is it correct to say that a quantum factoring machine has an "exponential speedup"? Isn't it more accurate to say that the exponential difficulty is a property of the classical algorithms, not of the problem itself?
In other words the size of funding QC is getting is nowhere close to the other hype bubbles and there are some significant peer-reviewed results that have been generated from it, so for the time being you can still give it the benefit of the doubt.
For example it has definitely enhanced our understanding of quantum chemistry and computational complexity, and anyone who invests time learning QC will end up having solid new insight about how the world works and deep engineering knowledge of electronics, which you can't say about many other bubbles.
For example, compare how many QC startups YC has funded (I think 0?) to blockchain, crypto, AI-assisted medicine and web3. There is no comparison. Picking on QC is far down my list if you want to have a go at hype bubbles.
>"The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about 'how quantum computers will help their business,'"
well, that makes QC a bona fide tech industry.
In the case of the Internet Bubble stocks were down 78%, but it was not hard to do well in the end given diversification and a long enough horizon.
In the case of the Dutch Tulip bubble there was no good ending for anyone except those who got out early.
Some bubbles like NFTs generate strong opinions but have yet to have final judgment from history.
I think the quantum computing bubble is different than all three, but closer to the Internet than to Tulips. In which case the conventional strategy would be to diversify and expect a long time horizon.
This is untrue. If you were invested in what became the stars of that era: Amazon, Red Hat, Cisco, a few others, you eventually made decent money, although far worse than if you had stayed out and bought the dip.
If you had a diversified portfolio of 'new economy' stocks which didn't include a few winners like this, you might have lost over 95% of your money and never got it back. Lots and lots of stocks simply disappeared or were bought for peanuts. Many others, including lots of very very highly rated ones like Yahoo never exceeded their bubble-era peaks.
By diversification I’m assuming a NASDAQ index fund, which many of the hot new Internet stocks, as well as larger established tech companies benefiting from the bubble, were part of.
If you invested in the NASDAQ at the absolute worst peak of the bubble:
- The initial crash put you at a -78% return.
- It took 21 years to recover all losses and earn a 300% return.
Why do you think that’s not “doing well” for a index fund closely tracking the bubble?
You could say alternative scenarios would’ve done better but that’s always the case.
The main point is, for someone with a long time horizon who was diversified, this turned out way way better than a lot of other bubbles turned out.
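Taking the figures above at face value (and reading "300% return over 21 years" as money invested at the peak ending up worth 4x), the annualized return works out to modest single digits. A quick check:

```python
# Hypothetical reading of the numbers above: $1 at the peak falls to
# $0.22 in the crash, then ends 21 years later at $4 (a 300% total return).

def annualized_return(total_multiple: float, years: float) -> float:
    """Compound annual growth rate implied by a total growth multiple."""
    return total_multiple ** (1.0 / years) - 1.0

peak_to_trough = 1.0 - 0.78           # 22 cents left on the dollar
cagr = annualized_return(4.0, 21.0)   # 300% total return over 21 years
print(f"CAGR from the peak: {cagr:.1%}")  # roughly 6-7% per year
```

So "doing well" here means market-average-ish compounding despite buying at the worst possible moment, which is the diversification-plus-horizon point being argued.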
And that's it! The author of this article is 100% right. Markets are fully aware, though; go ahead and try to short any publicly traded QC stock, lol. You can't. There are no shares to borrow and no liquidity on puts...
----- The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money. ----- ORLY? I guess I should go massively short IBM shares then. https://newsroom.ibm.com/image/2022%20IBM%20Quantum%20Roadma...
---- Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones. ---- ORLY? New cryptography can take 20 years or more to be fully deployed to all National Security Systems. NSS equipment is often used for decades after deployment. National security information intelligence value varies depending on classification, sensitivity, and subject, but it can require protection for many decades. -NSA
The solutions we do have do not work very well. Only the weakest, FALCON-512 (a bad name, as it had only 64 bits of quantum security, and now the dual lattice attack seems to reduce this to 20?), actually fits the TLS use case without breaking the internet. The signatures are just too big. Cloudflare has testing that shows this.
If that wasn't enough, this person is completely unaware of the annual survey of quantum researchers that actually puts the arrival of a cryptanalytically relevant quantum computer at 2030 or so. Peter Shor is actually one of the people polled in the survey; this person is not. And if you're still skeptical, you can look at the survey's estimates since 2018. These estimates are clearly trending sooner and sooner, instead of further and further away.
If you still have doubts, read this: https://www.whitehouse.gov/briefing-room/statements-releases...
Quoting:
> Billions of dollars have poured into the field in recent years, culminating with the public market debuts of prominent quantum computing companies like IonQ, Rigetti and D-Wave ... These three jointly still have a market capitalisation of $3bn, but combined expected sales of about $32mn this year (and about $150mn of net losses), according to Refinitiv.
> The reality is that none of these companies — or any other quantum computing firm, for that matter — are actually earning any real money. The little revenue they generate mostly comes from consulting missions aimed at teaching other companies about "how quantum computers will help their business", as opposed to genuinely harnessing any advantages that quantum computers have over classical computers.
Well, we do know that P <= BQP <= PSPACE, and we have one important example that lies in BQP but not in P (for all we know). It's just not clear how important that particular example is for the kind of computing we do today, if it ever becomes practical. It looks like it'd rather result in a one-time nuisance for sysadmins, like Y2K was.
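That one important example is factoring, and the classically-hard core of Shor's algorithm is order finding. A tiny brute-force sketch of that subroutine (exponential in the bit length of N classically; a quantum computer does it in polynomial time, which is the entire speedup):

```python
import math

# Order finding: given a coprime to n, find the smallest r > 0 with
# a**r % n == 1. Brute force is exponential in the bit length of n;
# this is the step Shor's algorithm replaces with a polynomial-time
# quantum subroutine.

def order(a: int, n: int) -> int:
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_with_order(a: int, n: int) -> int:
    """Recover a nontrivial factor of n from an even order, when lucky."""
    r = order(a, n)
    if r % 2 == 0:
        g = math.gcd(pow(a, r // 2, n) - 1, n)
        if 1 < g < n:
            return g
    return 0  # unlucky choice of a; classically you'd just pick another

print(factor_with_order(7, 15))  # order of 7 mod 15 is 4 -> prints the factor 3
```

Everything after the order-finding step (the gcd trick) is cheap classical arithmetic, which is why that single subroutine carries the whole BQP-vs-P distinction here.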
The hope was for applications in new areas like materials and drug design. The author has posted a link to one paper suggesting that we might not see exponential speedups in chemical simulations, but that's not an outright refutation either.
A real performant quantum computer could potentially revolutionize a lot of industries. But selling it as an accelerator of molecular dynamic simulations is not quite as sexy.
https://scottlocklin.wordpress.com/2019/01/15/quantum-comput...
Operator Imprecision and Scaling of Shor’s Algorithm
I further posit that there are no quantum algorithms without binary equivalents.
Because i find it really hard to believe there will ever be an O(sqrt(n)) classical algorithm for unstructured search. How could there possibly be?
[Edit -- Additional considerations] In order for Grover's algorithm to work, you have to implement your conventional algorithm in Quantum hardware, with perfect fidelity. Digital computers can do this just fine because every gate is also a comparator, so signal to noise ratio isn't an issue. I fail to see how a quantum gate can possibly operate with enough fidelity to even just copy the input state after 2^64 stages, let alone complex logic AND the Grover Diffusion Operator.
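The O(sqrt(n)) behavior itself is easy to see in a noiseless statevector simulation; the sketch below runs Grover's "invert about the mean" iteration for N = 256 and the marked item's probability peaks after about (pi/4)*sqrt(N) ≈ 13 iterations, not N/2. Note this says nothing about the fidelity objection above: perfect gates are assumed by construction.

```python
import math

# Noiseless simulation of Grover iterations over N amplitudes.
# Oracle: flip the sign of the marked amplitude.
# Diffusion: reflect every amplitude about the mean ("invert about the mean").
# Perfect fidelity is assumed by construction - exactly the objection above.

def grover_probability(n: int, marked: int, iterations: int) -> float:
    amps = [1.0 / math.sqrt(n)] * n              # uniform superposition
    for _ in range(iterations):
        amps[marked] = -amps[marked]             # oracle
        mean = sum(amps) / n
        amps = [2.0 * mean - a for a in amps]    # diffusion operator
    return amps[marked] ** 2                     # prob. of measuring marked

n = 256
best = round(math.pi / 4 * math.sqrt(n))         # ~13, i.e. O(sqrt(n)) steps
print(best, grover_probability(n, 7, best))      # probability close to 1
```

The catch, as the comment says, is that each of those iterations must run the oracle on coherent quantum hardware with enough fidelity, which this classical simulation gets for free.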
On utility, there's more than just Shor's: unstructured search [1], finance ([2], [3]). Even if quantum computers ultimately prove unfruitful commercially, that doesn't render it a useless endeavor. Like String Theory, it can beget findings in other areas, regardless of whether you can profit from them: novel classical recommendation algorithms ([4]), quantum algorithms for SAT that could possibly help automated theorem proving ([5]).
Part of the difficulty of quantum computing is that to show speedup, you need to find complexity bounds on classical problems whose runtime is actively being researched, e.g. neural networks ([6]).
As for their financial worthwhileness, while there is valid concern ([7], [8]), it's far too early to tell: it's hardware, not software. Also, it's my understanding that private investment is much larger than public funding in the US for quantum computing, both of which pale in comparison to China's investment. Thus, I wouldn't want to see investors shy away if the government is unwilling to make up the difference!
[1] - https://en.wikipedia.org/wiki/Grover%27s_algorithm
[2] - https://arxiv.org/abs/1905.02666
[3] - https://arxiv.org/abs/1908.08040
[4] - https://scottaaronson.blog/?p=3880
[5] - https://cstheory.stackexchange.com/questions/36428/do-any-qu...
[6] - https://arxiv.org/abs/1912.01198
[7] - https://www.microsoft.com/en-us/research/project/topological...
> The most prominent application by far is the Shor algorithm (opens a new window)for factorising large numbers into their constituent primes, which is exponentially faster than any known corresponding scheme running on a classical computer. Since most cryptography currently used to protect our internet traffic are based on the assumed hardness of the prime factorisation problem, the sudden appearance of an actually functional quantum computer capable of running Shor’s algorithm would indeed pose a major security risk.
> Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones.
Note that Shor's algorithm breaks not just factoring, but also discrete log, including elliptic curve discrete log. That includes classic DH and DSA of course, as well as ECDSA and ECDH, whether they're over Bitcoin's curve, the other NIST curves, Brainpool, {curve,ed}{25519,448}, pairing-friendly curves, everything. Almost all broadly deployed public-key crypto uses RSA or elliptic curves. Those alternative public-key algorithms are still being worked out, and will take years to broadly deploy, so if a QC gets built, it will probably be able to break into straggling systems for some years. There is also a risk that the replacements will eventually fall to quantum or even classical attack, especially considering that a significant fraction of the proposed replacements already have fallen (most recently SIKE) or been weakened (eg, every multivariate quadratic sig). They may also have other security problems, eg implementation bugs or side-channel attacks.
The surviving quantum-secure algorithms are all either pretty inefficient (McEliece and SPHINCS+, and CSIDH and SQISign, but those are also bleeding-edge), or use structured lattices (Kyber, Falcon, Dilithium, NTRU and NTRU Prime, etc) or structured codes that look kind of like structured lattices (BIKE, HQC). So we'll have most of our eggs in just a couple of baskets again, and outside of applications that can use McEliece and SPHINCS+, they'll be newer, less-tested baskets. Also, while fast, the structured lattice and structured code systems still use significantly more bandwidth than elliptic curves.
Using long-term symmetric keys instead of or in addition to public-key crypto is possible in some applications, but it's obnoxious and limiting: you'd end up with some combination of Kerberos derivatives (with trusted third parties acting as single points of security failure), mailed smartcards or other secrets, and physical in-person meetings to set up shared keys.
So the bigger issue in my view is that outside of Bitcoin, breaking crypto is mostly a net negative for society. Transitioning to quantum-secure crypto is also a negative, in that it will take a ton of work and the replacements are less efficient than elliptic curves, and may have security problems. (It's also probably unavoidable because governments will try to build QCs to break crypto even if private industry doesn't.) So all this money is being spent on something whose first major application will be negative, if it even works at all. Hopefully the positive stuff will outweigh this.
To write an article for the site, we would need to:
1. Write a headline with no mentions of any experts.
2. Write another headline mentioning at least one expert in it.
3. Write the content without mentioning any experts.
4. Write the content and sprinkle names of experts as needed.
5. Publish.
Now, the reader would then:
1. Be exposed to the no-experts version of the article - both headline and content.
2. Once finished, the reader will be prompted to write their thoughts on the article.
3. Click “Reveal”.
4. The reader would then skim or read the whole article again, but this time it would mention the experts.
5. Prompt the reader to evaluate how their thoughts had changed after reading the expert version of the article.
I’m so gullible: seeing experts in anything, especially when names of prestigious institutions or titles are tacked onto them, tends to shut down the reasoning part of my brain altogether.
Bear in mind, the site I proposed is not a place to police how articles should be written; rather, it’s all about increasing its readers’ awareness of how much mentions of an authority can impact their initial reasoning and judgement, and sometimes make them stop reasoning at all. My view is that mentions of an authority are useful for calibrating our judgements after we have tried to reason on our own, but not before that.
And yeah, I have no opinion on the original post. Just like to go off on a tangent once in a while.
The same goes for health advice, but this time it's 99.9%+: if you're not in the field, you can just listen and hope you're good at estimating who is more credible, or more likely to do good research, or to make truthful claims. Trying to evaluate them yourself is a recipe for being wrong and creating your own bubble.
If the article says "random guy X says quantum computing is a scam because Y", there is nothing I can take away from it, because it's very easy to make Y both incorrect and plausible-sounding to me. If I know it's an Oxford physics professor who makes the claim, I can learn that Y is at least a serious enough reason not to be easily dismissed.
Appeal to authority is bad as an argument when people knowledgeable in the field try to debate a certain point. In other cases it's very useful to know who makes the claims and very often it's the only thing that gives the claims credibility.
I've always wondered if the format of long form expert opinions could be replaced by a knowledge graph that is independent of the expert.
E.g. instead of article "Economist John rejects minimum wage"
Root node "Minimum wage is not the best solution to problem x" -> because -> <node to define problem>, <node to define alternative solutions> -> because -> <leaf nodes of studies or models>
In this way other experts could add to the graph and the differences between different branches of argument could be more easily compared or automatically updated. Articles could still be written, but could reference specific nodes or edges of the graph which adds clarity to the discussion.
Basing the perceived quality of an article on an appeal to authority doesn't make much sense either.
The Royal Society's motto is literally "take nobody's word for it."
Someone hasn't been watching the cryptocurrency markets.
That's partly tongue in cheek. But there are countless examples of the market remaining irrational longer than one can stay solvent.
Witness the continued success of BTC and Ether amid newer options that outperform them on every tech-related metric, often by many orders of magnitude. I conclude that marketing hype and the first-mover advantage form the vast bulk of valuation in a novel tech that people don't understand.
This is not to take away from the author's point at all - I would hope that anyone who invests in quantum computing reads the criticism from an insider who can actually read the papers.
However, as irrational and harmful as it is, I don't expect BTC to drop to zero before the day quantum computing actually does follow through. Rationality really isn't our thing.
Well said. Having worked in the blockchain space as a developer and founder since 2017, I've come to the same conclusion. The formula for success is hype + first-mover advantage. Aside from that, it's all about social climbing and politics around those projects.
It's surprising how long the first-mover advantage lasts, and it's strange to see that even developers who should know better are being pulled into learning poorly designed (or outdated) technologies. They're conflating the financial achievements of projects with their technological achievements.
I guess that's what happens when big investors are laser-focused on making as much money as possible instead of also trying to drive innovation forward.
IMO, the inability to separate the two is a major reason why we have such significant financial bubbles in the tech sector.
It's really not a surprise it's not about tech when the goals and challenges were never technical.
On the other hand, in areas where it is about tech, we see the superior option winning over established players all the time: Google, WhatsApp, and AMD, to name three of many examples from various points in this millennium.
Tell that to Wikileaks after Mastercard and Visa stopped letting people donate due to political pressure.
Tell that to people charged double digit percentages just for the privilege of spending hours transferring money across borders.
What about the Canadians who had hundreds of accounts frozen for being "linked" to civil protests (however much one may disagree with their cause)?
People complain that micro-transactions aren't cost-effective, but there are, right now, secure, decentralized, fee-less crypto technologies (even quantum-resistant ones) that allow for the transfer of millionths of a cent.
With no disrespect, I feel that people who say crypto has no use-case are being let down by their imagination and research skills.
> skipping over regulations that apply to traditional currencies
There is actually at least one cryptocurrency working on becoming recognized as a legitimate currency: Nano.
> in areas where it is about tech we see superior one winning over established players all the time.
People could immediately see that Google gave better search results and that WhatsApp was instant and free; and I don't think AMD is as clear-cut an example as you state.
Notice that in my comment I stated the condition that people can't understand the tech by themselves - such is the case with crypto. People can't understand the proofs, the difference between PoW, PoS, and Block Lattice, or whether a coin is decentralized or not, etc., just by using it.
Arguing against myself a bit, I did think people would notice how slow and expensive their BTC transactions were, how awkward setting up an account is, or how often claimed roadmap milestones were pushed back - but they haven't. I don't think I accounted for the sheer volume and reach of bag holders and maxis.
BTC uses the same amount of power as Argentina - it's perverse. The people claiming it's better in any way, on any metric outside adoption are just wrong. The fact that BTC holders and miners can claim otherwise, and people apparently believe them, is kinda my whole point.
It feels like there's a strong parallel with QC, or nuclear energy, or even cannabis. People get deluded by hype from vested interests if they have no way to accurately judge for themselves.
What outperforms Bitcoin in terms of decentralization, scalability, and resilience?
[0] https://docs.minaprotocol.com/static/pdf/technicalWhitepaper...
I say this because:
1. He says nothing about the breakthroughs in quantum error correction that are allowing IBM to promise a leap from 89 qubits today to 4,000 qubits in 2025 (still not enough on its own for a cryptographically relevant quantum computer (CRQC) running Shor's algorithm for an exponential speedup in breaking e.g. RSA-2048, which some research suggests would take 20M qubits, including those used for quantum error correction).
2. He did not mention Grover's algorithm, which provides a quadratic speedup over classical search (in the time complexity of finding a particular item in an unsorted list of N items). Even a quadratic speedup is considerable when N is large.
3. He did not mention the breakthrough by University of Chicago researchers that showed multiple quantum computers can be entangled over tuned optical fibers to act as a single quantum computer. This still doesn't mean that we can go from 4,000 qubits to 20M by networking 5,000 of the quantum computers IBM promised for 2025, in 2025, but it provides a trajectory for networked quantum computing as a horizontal scaling strategy.
4. He did not mention the $100B allocated this year by the White House/Congress for CRQC research.
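To make point 2 concrete, here is a rough back-of-the-envelope comparison (my own illustration, not from the article) of classical versus Grover query counts for unstructured search; the (pi/4)*sqrt(N) figure is the standard optimal Grover iteration count:

```python
import math

def classical_queries(n: int) -> int:
    # Worst case for unstructured search: examine every one of the n items.
    return n

def grover_queries(n: int) -> int:
    # Grover's algorithm needs roughly (pi/4) * sqrt(n) oracle calls.
    return math.ceil((math.pi / 4) * math.sqrt(n))

n = 2**40  # about a trillion items
print(classical_queries(n))  # 1099511627776
print(grover_queries(n))     # 823550 - roughly a million-fold fewer queries
```

This is why Grover's algorithm only halves the effective key length of symmetric ciphers (a quadratic, not exponential, speedup), whereas Shor's algorithm breaks RSA outright.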
What is his motive in giving us such an incomplete story with such a skewed conclusion? Is he working for a hedge fund that is shorting some stock? Or is he just a lay person trying to sound intelligent by writing about a field where they're not sufficiently informed?