The example given, taken by itself and without any context, is not clearly a question to me. An LLM is trained to answer questions or follow instructions, so it tries to identify one. But without context it isn't obvious that the math isn't the distraction; the model might just as well be expected to, e.g., confirm the fun fact. You assume the math is the real task because it makes up the majority of the text, but that isn't automatically given.