Not true. ANNs were one of the first general-purpose function approximators. Once people got their heads round ANNs properly, they were generalised to kernel methods. So modern machine learning often doesn't look like neural nets anymore, but the lineage from backpropagation is obvious.
Deep belief nets have made things that look like neural nets popular again, but things like sparse ICA stacks are suggestive of another wave of theory-led morphing back to non-neural architectures. But the point is that the theory followed the invention of neural architectures both times. So the root is the ANN, and thus I think it is a really important algorithm in history. It was a bio-inspired innovation too, which is interesting.
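For concreteness, here's a minimal sketch of the kind of ANN-plus-backpropagation setup being discussed. This is a hypothetical toy, not anyone's canonical implementation: one sigmoid hidden layer learning XOR by plain gradient descent, with all hyperparameters (layer size, learning rate, iteration count) chosen arbitrarily for illustration.

```python
import numpy as np

# Toy ANN: 2 inputs -> 4 hidden units -> 1 output, sigmoid activations,
# squared-error loss, trained on XOR with vanilla gradient descent.
# All sizes and the learning rate are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weight matrices for input->hidden and hidden->output
W1 = rng.normal(size=(2, 4))
W2 = rng.normal(size=(4, 1))

lr = 1.0
losses = []
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule through the sigmoid layers
    d_out = (out - y) * out * (1 - out)   # error at output pre-activation
    d_h = (d_out @ W2.T) * h * (1 - h)    # error propagated to hidden layer

    # Gradient descent updates
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)
```

After training, `losses[-1]` should be well below the initial loss, which is all the sketch is meant to show: the forward/backward/update loop that later kernel-method and deep-learning theory grew out of.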