> 25% increase every year
Is there anything like Moore's law for optical cable bandwidth?
Though the improvement in transistor economics has definitely benefited transport, the bulk of the improvement over time is due to breakthroughs in manufacturing, materials science, semiconductor optics, and signal processing.
In the past five to ten years, however, there has been a discussion of whether this can continue or whether we will hit the speculated “capacity crunch” – i.e. reaching the operational capacity of the fiber. This is all considering a single fiber strand.
The limit to the rate through the fiber is not well understood and is quite an active research area. Previously, the “limit” has been broken by technology shifts, such as coherent transmission, different amplifier technology, and more advanced signal processing. The big question is what the next big shift will be. Combs, as presented in the article, are a promising direction.
I mean, I'm guessing it's just saying the bit rate depends on the signal frequency, but it's still funny to see canceling units.
Related: a "spacing" measured in GHz.
Again, it makes sense, but it struck me that someone could write a total spoof paper with nonsense units and I wouldn't be able to tell the difference!
The fact that the time-unit cancels out just shows that spectral efficiency is independent of any sense of time one might have.
Measuring "spacing" in GHz makes sense if you consider the way heterodyne mixing shifts the signal around, without affecting the spacing of sub-carriers.
Frequency spacing makes sense as namibj points out. Most long-distance telecom links operate in the optical C-band, which is roughly 5 THz wide. (A wavelength of 1525 nm has an optical frequency of about 196.6 THz, and a wavelength of 1565 nm has an optical frequency of about 191.6 THz.) You can select optical frequencies to modulate within this optical bandwidth. Given a certain modulation rate (>> 1 GHz), separating the channels in units of 1 GHz is reasonable.
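The arithmetic is easy to check with f = c/λ; a quick sketch (the helper name is my own):

```python
# Sketch: convert C-band edge wavelengths to optical frequencies via f = c / wavelength.
C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_to_thz(nm: float) -> float:
    """Optical frequency in THz for a vacuum wavelength given in nm."""
    return C / (nm * 1e-9) / 1e12

f_low = wavelength_to_thz(1565)   # long-wavelength edge, ~191.6 THz
f_high = wavelength_to_thz(1525)  # short-wavelength edge, ~196.6 THz
print(f"{f_high:.1f} THz - {f_low:.1f} THz = {f_high - f_low:.1f} THz of usable band")
```

That ~5 THz of band divided into GHz-spaced channels is exactly why "spacing in GHz" is the natural unit.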
[0] https://en.wikipedia.org/wiki/Shannon%E2%80%93Hartley_theore...
One can keep going to higher order modulations to improve the spectral efficiency but the SNR required increases exponentially with constellation order.
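The "exponentially with constellation order" point falls straight out of the Shannon–Hartley bound; a small sketch of the lower bound (real modems need a few dB more than this ideal figure):

```python
import math

# Shannon-Hartley: C/B = log2(1 + SNR), so a target spectral efficiency
# of k bit/s/Hz needs SNR >= 2^k - 1. Each extra bit roughly doubles
# the required linear SNR, i.e. about +3 dB per bit.
for bits_per_hz in (2, 4, 6, 8):  # roughly QPSK, 16-QAM, 64-QAM, 256-QAM
    snr = 2 ** bits_per_hz - 1
    print(f"{bits_per_hz} bit/s/Hz -> SNR >= {snr} ({10 * math.log10(snr):.1f} dB)")
```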
Of course, "one day perhaps" this could change.
[0] https://www.cablelabs.com/forward-error-correction-fec-a-pri...
[1] PDF warning: https://www.infinera.com/wp-content/uploads/Soft-Decision-Fo...
Old guy here: I'm wondering when that changed. Early OC-768 requirements were 10^-12 BER pre-FEC (e: at least in short-haul), which was down from 10^-9 for OC-192.
- BER = Bit Error Rate
- FEC = Forward Error Correction
- TANSTAAFL = https://en.wikipedia.org/wiki/There_ain%27t_no_such_thing_as...
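To get a feel for what those pre-FEC targets mean at line rate, a back-of-the-envelope sketch (assuming the nominal SONET rates of ~10 Gb/s for OC-192 and ~40 Gb/s for OC-768):

```python
# Rough error arrival rate implied by a given BER at a given line rate.
for name, rate, ber in [("OC-192", 10e9, 1e-9), ("OC-768", 40e9, 1e-12)]:
    errors_per_sec = rate * ber  # expected bit errors per second
    print(f"{name}: ~{errors_per_sec:g} errors/s, "
          f"i.e. one error every {1 / errors_per_sec:.1f} s on average")
```

So the tighter OC-768 spec works out to roughly one raw bit error every 25 seconds, versus ten per second for OC-192.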
It seems to be error-correction data added to the sent stream so the receiver can 'mend' broken data, vs. error detection where the receiver spots an error and requests a re-send.
On the other hand, a high BER doesn't preclude usefulness; one would just need a different FEC to deal with it. Maybe a practical implementation with a robust FEC would still come out ahead in useful throughput.
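The principle is easy to demo with the simplest possible code, a 3x repetition code (real links use far stronger codes like RS or LDPC, but the idea is the same: sender-added redundancy lets the receiver repair errors with no re-send):

```python
import random

def encode(bits):
    # Send each bit three times.
    return [b for b in bits for _ in range(3)]

def decode(coded):
    # Majority vote over each group of three received bits.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

random.seed(0)
data = [random.randint(0, 1) for _ in range(1000)]
sent = encode(data)
# Channel with a 1% raw (pre-FEC) bit error rate.
received = [b ^ (random.random() < 0.01) for b in sent]
recovered = decode(received)
errors = sum(a != b for a, b in zip(data, recovered))
print(f"residual errors after FEC: {errors} / {len(data)}")
```

With a 1% raw BER, the chance of two flips landing in the same triple is about 3 × 0.01², so the residual error rate drops to roughly 0.03% while only a third of the line rate carries payload: the throughput trade-off the parent is describing.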
Why would applications even care about the BER before FEC?