> Regarding the similarity to a double-slit interference pattern, what I found in my investigation (see my previous comment) is that it was solely the result of the sorting algorithm used for the X coordinates of each possible outcome/weight. Ordering by the average X location results in a binomial distribution, but there are still ties to deal with. Applying secondary sorting to those based on the position of the leftmost or rightmost X creates secondary binomial distributions within the main binomial distribution, similar to what Jonathan showed, except without all the spikiness, because his example wasn't consistently sorted in this manner.
> You can see the sorting I’m talking about by looking at the sorting of Xs in the output of weights in Jonathan’s double-slit examples. The bottom half is sorted as I stated, but the top half is somewhat out of order, corresponding to the spikes in the tallest curve in the center of the graphs.
> Maxime C, I had the same question as you – why choose this sorting method for encoding the photon positions? My conclusion, however, is that it doesn't work. Each additional step run with a large string produces an additional secondary binomial distribution. To get a smooth curve, we'd need to run billions of steps on a large string, but then we'd also see billions of peaks, which no longer compares well with the expected diffraction pattern.
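The two-level sort being described could be sketched like this (the outcome data here is made up for illustration, not taken from the article):

```python
# Hypothetical outcomes: each is a tuple of X positions for one outcome/weight.
# (Made-up data, not from the article.)
outcomes = [(0, 4), (1, 3), (2, 2), (0, 2), (1, 1)]

# Primary key: average X location; secondary key (tie-break): leftmost X.
ordered = sorted(outcomes, key=lambda xs: (sum(xs) / len(xs), min(xs)))
print(ordered)  # ties on average X are broken by the leftmost position
```

The secondary key is what groups tied outcomes into runs, which is where the nested binomial sub-distributions come from.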
For example, Werner Heisenberg's doctoral thesis[1] arose from a contract his doctoral advisor Arnold Sommerfeld had with a company that dealt with the channelling of the Isar river through the city of Munich. Very practical problems involving differential equations - kind of the bread and butter of physicists and engineers at the time.
What if quantum mechanics was found today in a world where the bread and butter has shifted to computer science, linear algebra and discrete math? Would we still end up with waves and differential equations, or would another formulation arise more naturally?
EDIT: I think a beautiful (but imperfect) example to illustrate this dichotomy in the ways of thinking is how the Bell inequality can be approached with photons and polarization or as a game. Thinking about Alice and Bob or polarized light, which would you prefer?
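To make the game version concrete, here's a sketch of the CHSH game (the standard game form of the Bell inequality): the referee sends bits x and y to Alice and Bob, who answer bits a and b without communicating, and they win iff a XOR b equals x AND y. Enumerating every deterministic classical strategy shows the classical limit:

```python
from itertools import product

best = 0
for fa in product((0, 1), repeat=2):        # Alice's answer for x = 0, 1
    for fb in product((0, 1), repeat=2):    # Bob's answer for y = 0, 1
        # count wins over the four referee questions (x, y)
        wins = sum((fa[x] ^ fb[y]) == (x & y) for x in (0, 1) for y in (0, 1))
        best = max(best, wins)

print(best / 4)  # classical strategies win at most 75% of the time
```

The quantum strategy with entangled photons and tilted measurement bases wins with probability cos^2(pi/8) ≈ 0.85, which is the whole point: no classical strategy, shared randomness included, can match it.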
There was considerable disagreement between the factions of physicists who favoured the different versions; it essentially ended when, after some considerable theoretical effort (mostly by Dirac), it was shown that the two pictures are exactly equivalent.
Physicists still use whichever formulation is most suitable for the problem they're trying to solve. For example, if you're analysing something where you care about a bunch of bound states, like the simple harmonic oscillator or the hydrogen atom, then the matrix picture tends to be easier to work with.
You are right that wave mechanics was more popular than matrix mechanics because physicists were already very familiar with wave methods.
Heisenberg's matrix mechanics came first, and was formulated entirely in terms of linear algebra. See, for example, the introduction of basic linear algebra techniques in the famous Born–Jordan paper from 1925:
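The centerpiece of that paper is the commutation rule [x, p] = iħ, stated as a matrix equation. A toy numerical check (my own sketch, not from the paper): with truncated N×N harmonic-oscillator ladder matrices, the commutator comes out as iħ on the diagonal, apart from an artifact in the last entry caused by the truncation.

```python
import math

N = 4
# lowering operator a: a[m][n] = sqrt(n) when m == n - 1, else 0
a = [[math.sqrt(n) if m == n - 1 else 0.0 for n in range(N)] for m in range(N)]
adag = [[a[j][i] for j in range(N)] for i in range(N)]  # raising operator (transpose)

def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(N)) for j in range(N)] for i in range(N)]

# position and momentum in units with hbar = m = omega = 1
x = [[(a[i][j] + adag[i][j]) / math.sqrt(2) for j in range(N)] for i in range(N)]
p = [[1j * (adag[i][j] - a[i][j]) / math.sqrt(2) for j in range(N)] for i in range(N)]

xp, px = mul(x, p), mul(p, x)
comm = [[xp[i][j] - px[i][j] for j in range(N)] for i in range(N)]
# diagonal of [x, p] is i (i.e. i*hbar) everywhere except the corner entry,
# which absorbs -(N-1)i so that the trace of the finite commutator stays zero
```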
For me it's a reminder that physics describes how quantum systems evolve, but it doesn't (in a sense) tell us what those systems are. Are particles waves? Matrices? Excitations of a field?
Each of these descriptions works, so I can't exclusively say any one of them is what particles really are. Being a macroscopic being is philosophically frustrating.
"Quantum mechanics is what you would inevitably come up with if you started with probability theory, and then said, let's try to generalize it so that the numbers we used to call 'probabilities' can be negative numbers. As such, the theory could have been invented by mathematicians in the nineteenth century without any input from experiment. It wasn't, but it could have been."
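Aaronson's point can be sketched in a few lines: classical probability preserves the 1-norm (entries nonnegative, summing to 1), while quantum mechanics preserves the 2-norm and lets amplitudes go negative. The 2×2 Hadamard matrix is the standard minimal example:

```python
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]  # norm-preserving in the 2-norm sense (orthogonal)

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

amp = apply(H, [0.0, 1.0])   # one entry is now a *negative* "probability"
assert amp[1] < 0
assert abs(sum(a * a for a in amp) - 1.0) < 1e-12  # squared amplitudes sum to 1

# Applying H again returns the state to [0, 1]: the negative entry lets
# branches cancel (interference), which nonnegative probabilities can't do.
print(apply(H, amp))
```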
Well worth a read. In fact, I'd say it's worth buying the book just for this one chapter.
I discussed the difference earlier: https://news.ycombinator.com/item?id=38255476
Before this I didn’t know Stephen hosted “Live CEOing” sessions and now I wish this was the norm!
0: https://www.twitch.tv/videos/2083073452 (timestamp around 50:00)
https://www.preposterousuniverse.com/podcast/2021/07/12/155-...
Looking at the numbers on the graphs for single-slit diffraction, they are mostly just binomial coefficients; not sure why there are pieces missing in the last rows. That is also what you would expect when you repeatedly make binary decisions to go left or right. The article does not mention binomial distributions once; they only appear in a comment.
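The left/right intuition is easy to sanity-check: counting the final displacements of all 2^n binary paths reproduces the binomial coefficients exactly.

```python
from itertools import product
from math import comb

n = 8
counts = {}
for path in product((-1, 1), repeat=n):  # every sequence of n left/right steps
    x = sum(path)                        # final displacement
    counts[x] = counts.get(x, 0) + 1

# displacement 2k - n corresponds to k right-steps, with C(n, k) paths
assert all(counts[2 * k - n] == comb(n, k) for k in range(n + 1))
```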
And then they claim that it converges to the actual single-slit diffraction distribution, something with a Chebyshev polynomial and the sinc function, according to the article. Seemingly without justification beyond looking at graphs and noting that they are both bell-shaped. As I said, not sure what is going on in the last rows of the graphs, but I would almost bet that the two functions are not the same, even in the limit as it becomes a Poisson distribution plus whatever the last rows do.
Why do they not just prove that the two are the same? The entire article seems to be about getting numbers out of their multiway system and then concluding that - if you squint hard enough - they look somewhat like diffraction patterns.
A Gaussian distribution, I think. But they're certainly not the same function, and that should be immediately obvious to a math grad with experience in physics. The sinc function, for one, has secondary maxima (its plot in the article is very conveniently cropped to allow pretending those don't exist). Just put a hair in the path of a laser beam and you will see the local maxima in light intensity! Their "single-slit" string procedure, on the other hand, can only generate a single central peak. This really makes no sense at all.
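The difference is easy to check numerically: the diffraction intensity sinc^2 rises again after its first zero at x = 1 (a secondary maximum near x ≈ 1.43), while a Gaussian only decreases away from its peak.

```python
import math

def sinc2(x):
    # normalized sinc squared, the single-slit intensity profile
    return 1.0 if x == 0 else (math.sin(math.pi * x) / (math.pi * x)) ** 2

def gauss(x):
    return math.exp(-x * x)

assert sinc2(1.43) > sinc2(1.0)  # intensity rises again past the first zero
assert gauss(1.43) < gauss(1.0)  # a Gaussian is monotone away from its peak
```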