In the US, it wasn't understood until the 80s that early language access is essential to infant brain development, at which point ASL gradually replaced clumsy attempts to teach English first. So there is now a whole culture (capital-D Deaf) whose members use ASL all day and have English as a second language.
In that context, there may always be some TTY/TDD and caption-phone users, and captions can be auto-generated now. We're probably at the point where a true AI video interpreter is possible; good ASL training data is likely to be the bottleneck.
For hearing people, the trend seems to be that younger generations are more into technology, so it surprises me that for deaf people it would be the opposite.