This is new stuff, so it's probably not a good idea to force old words, with established meanings, onto it.
The training code and dataset are analogous to the developer who wrote the script.
The model and weights are the end product that is then released.
The inference code is the runtime that executes the model. That would be e.g. PyTorch, which can import the weights and run inference.
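To make that split concrete, here's a toy sketch in plain Python (not real PyTorch, just a stand-in): the weights are inert data on disk, and a separate piece of inference code loads and runs them.

```python
import json
import os
import tempfile

# Toy stand-in for a released model: the "weights" are just numbers,
# produced by some training process we never see.
weights = {"w": [0.5, -1.2], "b": 0.1}

# "Release" step: the weights are serialized and shipped as data.
path = os.path.join(tempfile.mkdtemp(), "model.json")
with open(path, "w") as f:
    json.dump(weights, f)

# "Inference code" step: a separate runtime loads the weights and
# runs them. This is the role PyTorch plays for real models.
def run_inference(weights_path, x):
    with open(weights_path) as f:
        w = json.load(f)
    return sum(wi * xi for wi, xi in zip(w["w"], x)) + w["b"]

print(round(run_inference(path, [1.0, 2.0]), 6))  # ≈ -1.8
```

The point of the sketch: nothing in `model.json` is executable on its own, exactly like a real weights file.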
No, I completely disagree. Python source is close to pseudocode. Source exists for the specific purpose of being easily and completely understood by humans, because it's for and from humans. You can turn a Python calculator into a web server, because it can be split and separated at any point, because it can be completely understood at any point, and it's deterministic at every point.
A model cannot be understood by a human. It isn't meant to be. It's meant to be used very close to as-is. You can't fundamentally change the model or dissect it; you can only nudge it in a direction, with the force of that nudge proportional to the money you can burn, along with the hope that it turns out how you want.
That's why I say it's closer to a binary: more of a black box you can use. You can't easily make a binary do something fundamentally different without changing the source. You can't easily see into that black box, or even know what it will do without trying. You can only nudge it to act a little differently, or use it as part of a workflow. (decompilation tools aside ;))
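That "nudge" is, concretely, something like a gradient step: you don't edit a weight to directly install a behavior, you push the weights slightly toward an objective and hope they converge. A toy sketch in plain Python (one made-up weight, not a real training loop):

```python
# Toy "model": a single weight. We can't read off the behavior we want
# and write it in; we can only nudge the weight toward examples we like.
w = 0.0

def model(x, w):
    return w * x

# Desired behavior on one example: model(2.0) should output 6.0
# (so the weight should end up near 3.0).
x, target = 2.0, 6.0
lr = 0.05  # learning rate: the "force" of the nudge

for _ in range(200):
    pred = model(x, w)
    grad = 2 * (pred - target) * x   # d/dw of the squared error
    w -= lr * grad                   # nudge, don't rewrite

print(round(w, 3))  # converges toward 3.0
```

Scaled up to billions of weights and huge datasets, that loop is the only real lever you have, which is why the cost of the nudge tracks the compute you can burn.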