I've wondered whether Face ID and its Android counterparts are quietly building an extraordinary labeled dataset of facial expressions at the point of sale.
With users trained to scan their faces before every transaction, tech companies could correlate transactions with facial expressions, facial expressions with emotions, and emotions with device content. I can imagine algorithms that subtly curate the user experience, selectively showing notifications, content, and advertising to coax users toward "retail therapy".
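To make the speculation concrete, here is a minimal sketch of the kind of pipeline being imagined. Everything here is invented for illustration: the expression labels, the emotion mapping, and the ranking heuristic are all hypothetical, and no real Face ID or Android API exposes anything like this.

```python
from collections import Counter

# Hypothetical sample: each face scan at checkout yields an expression
# label alongside the transaction it authorized. All data is invented.
scans = [
    {"txn": "coffee", "amount": 4.50, "expression": "neutral"},
    {"txn": "shoes", "amount": 120.00, "expression": "smile"},
    {"txn": "groceries", "amount": 63.20, "expression": "frown"},
    {"txn": "handbag", "amount": 240.00, "expression": "smile"},
]

# Step 1: map expression -> crude emotion label (a hypothetical mapping).
EMOTION = {"smile": "pleasure", "frown": "stress", "neutral": "neutral"}

# Step 2: correlate emotions with purchase categories by counting
# (emotion, transaction) co-occurrences.
def emotion_by_purchase(scans):
    pairs = Counter()
    for s in scans:
        pairs[(EMOTION[s["expression"]], s["txn"])] += 1
    return pairs

# Step 3: bias a content/ad ranking toward categories linked to
# "pleasure" -- the "retail therapy" nudge described above.
def rank_ads(candidates, pairs):
    pleasurable = {txn for (emo, txn) in pairs if emo == "pleasure"}
    # Stable sort: pleasure-linked categories float to the front.
    return sorted(candidates, key=lambda c: c not in pleasurable)

pairs = emotion_by_purchase(scans)
print(rank_ads(["groceries", "shoes", "handbag"], pairs))
# → ['shoes', 'handbag', 'groceries']
```

The point of the sketch is only that each step is trivial once the labeled data exists; the face scan that authorizes the payment is what would supply the label.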