In many ways, hiring advanced degree holders is a crapshoot. They have skills you probably can't train, but they often come in with fewer software development skills than your undergrad intern, despite theoretically having more years of experience. You don't need to know git to publish in IEEE, or write unit tests or readable code, or debug an edge case, and your only code reviewer is a professor who doesn't care about this either. 'Good enough to publish' is a far cry from 'customers will pay for it.'
I'm not saying that the OP should hire these people, or that they would even be good hires. Maybe they don't want to take the risk or time of training up a new employee.
It just seems like a bit of a weird non sequitur to draw conclusions about someone's learning capabilities based on their performance in an interview outside their usual domain.
Which software skills do you find that academics
1) lack, and
2) are hard to pick up?
E.g. another commenter mentioned Git. Many academics don't use version control, but I think learning it to an acceptable level is not very difficult, so it doesn't really count.
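For what it's worth, the "acceptable level" I have in mind is roughly the handful of commands below: init, stage, commit, branch, and look at history. This is just a sketch of a typical day-one workflow, not a claim about what any particular employer expects; the file and branch names are made up.

```shell
# Minimal day-to-day Git: create a repo, commit, branch, inspect history.
git init -q demo
cd demo
git config user.email "dev@example.com"   # identity needed for commits
git config user.name "Dev"

echo "print('hello')" > main.py
git add main.py                           # stage the new file
git commit -q -m "Add main.py"            # record it with a message

git checkout -q -b feature/tweak          # work on a branch, not on main
echo "# tweak" >> main.py
git commit -q -am "Tweak main"            # -a stages modified tracked files

git log --oneline                         # two commits, newest first
```

Learning this much is an afternoon, not a semester; the genuinely hard parts of Git (rebasing, resolving conflicts across a team) come later and mostly with practice.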
I'm asking because I might be trying to get a job in industry soon.