> Do you know what happens if you accidentally couple lines during one of the manufacturing steps?
I understand the consequences. I also understand both physics and computer science. A 32-bit integer is sufficient to subdivide something the size of a wafer mask to well under the wavelength of the light used for photolithography. There is literally no way for additional precision to matter for things like "coupling lines". It is impossible.
See for yourself: https://www.wolframalpha.com/input?i=3+cm++*+2%5E-32
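The arithmetic is easy to check yourself. Assuming a ~3 cm mask edge and the 13.5 nm EUV wavelength (my numbers for illustration):

```python
# Sanity check: can a 32-bit integer grid resolve features finer than
# the lithography wavelength? (Assumes a 3 cm mask edge and 13.5 nm EUV.)
mask_edge_m = 0.03           # 3 cm, roughly the largest mask dimension
grid_cells = 2 ** 32         # distinct positions addressable by a uint32
cell_size_m = mask_edge_m / grid_cells

euv_wavelength_m = 13.5e-9   # EUV lithography wavelength

print(f"grid cell: {cell_size_m * 1e12:.2f} pm")                    # ~6.98 pm
print(f"wavelength / cell: {euv_wavelength_m / cell_size_m:.0f}x")  # ~1900x
```

Seven picometres per grid cell, three orders of magnitude below the wavelength. Extra precision buys you nothing physical.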
Iterated algorithms are a different beast entirely, but there are fixed-point or integer algorithms that sidestep these issues.
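As a sketch of what I mean (hypothetical code, not from the article): on an integer grid, the classic segment-crossing test reduces to the sign of an integer cross product, which is exact, with no epsilon tuning and no accumulated rounding:

```python
# Exact orientation test on integer coordinates: the sign of the 2D cross
# product says which side of segment AB point C lies on. With integer inputs
# the result is exact -- no floating-point rounding, no epsilon. (In C you'd
# widen 32-bit coords to 64-bit for the products; Python ints are unbounded.)
def orientation(ax, ay, bx, by, cx, cy):
    cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    return (cross > 0) - (cross < 0)   # +1 left, -1 right, 0 collinear

# Two segments properly cross iff each one straddles the other's line.
def segments_cross(p1, p2, q1, q2):
    d1 = orientation(*q1, *q2, *p1)
    d2 = orientation(*q1, *q2, *p2)
    d3 = orientation(*p1, *p2, *q1)
    d4 = orientation(*p1, *p2, *q2)
    return d1 * d2 < 0 and d3 * d4 < 0

print(segments_cross((0, 0), (4, 4), (0, 4), (4, 0)))  # True
print(segments_cross((0, 0), (1, 1), (2, 2), (3, 3)))  # False
```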
You cannot imagine the volume of computer science research that has been written on shape-shape intersections in both 2D and 3D! Literal textbooks' worth. Hundreds if not thousands of PhD-level papers. The sheer intellectual effort that has gone into optimisations in this space is staggering.
Hence my incredulity. I've worked with 128-bit numbers and even arbitrary-precision numbers, but only in the context of computational mathematics. There are no "physics constraints" in mathematics to limit the benefit of additional range or precision.
The financial argument doesn't hold water either. Modern chips have tens of billions of features, and the data volume can exceed the main memory of even the largest computers. Data representation efficiency and simulation speed absolutely would have tangible business benefits: faster iteration cycles, lower simulation cost, better optimisation solutions, etc...
This is literally the point of the article -- being able to do things in GPUs using their native 32-bit maths capabilities is a huge benefit to the chip design workflow. This requires clever algorithms and data structure design. You can't be wasteful because "it feels safer" if you have a budget of 24 GB (or whatever) to squeeze the mask data into.
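A back-of-envelope illustration (my numbers, not the article's): packing each rectangle as four int32s instead of four float64s doubles how much geometry fits in a given memory budget, which directly decides how much of the mask you can process per GPU batch:

```python
# How many axis-aligned rectangles fit in a 24 GiB GPU memory budget?
# (Illustrative only; real mask data formats are more elaborate than this.)
budget = 24 * 1024 ** 3   # 24 GiB of device memory
rect_int32 = 4 * 4        # x0, y0, x1, y1 as int32: 16 bytes
rect_float64 = 4 * 8      # the same rectangle in doubles: 32 bytes

print(f"int32 rects:   {budget // rect_int32:,}")    # ~1.6 billion
print(f"float64 rects: {budget // rect_float64:,}")  # ~0.8 billion
```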
> assume the experts in it are wrong?
Yes! Something I've noticed is that there is surprisingly little "cross pollination" between fields. You can have very smart people in one industry blithely unaware that another industry has solved their "very hard problem". I've seen this with biology, physics, medicine, etc...
How many chip design automation experts have also done low-level game engine programming? Maybe half a dozen in the whole world? Less?