Imagine somebody builds a mind out of machine learning. It passes the Turing test and more. It reports that it "feels", i.e. that it has qualia. Is it merely parroting what it hears and reads from humans? Or does it actually have a feeling when you show it an image of a sunset? At what breakpoint do you place your debugger to inspect whether it does?