Anyone older cut their teeth on assembly or C, NetBIOS, and the command line, and may or may not have stayed current afterwards. Consider that the industry skewed a lot "older" (just look at old photos) until those technological shifts happened and shook out the people who had learned in the 60s and 70s.
If something grossly disruptive to current development practices comes along (quantum computing, a parallel functional computing revolution, VR/augmented interfaces, or anything else), we're all going to be in the same boat: learn or die. Getting through that is less a matter of age itself than of the circumstances surrounding one's age and one's willingness to keep at it.