EMOTION is the new targeting signal
If cookies were yesterday’s breadcrumbs, emotional context is today’s flashing neon arrow.
Advertising AI agents like Cluep’s “Soma” scan what you post, watch and scroll – reading facial cues in posted images, the subjects of your photos, even hashtags – to guess how you feel in the moment, then serve a message that mirrors your mood.
In other words, ads can now show up exactly when someone is hyped about a concert, frazzled after a commute, or craving a burger.
Soma from Cluep bills itself as “the world’s first advertising AI agent.”
It doesn’t just parse text; it analyses images, videos and social chatter to tag users by emotion (“thrilled,” “nostalgic,” “hangry”) and context (“stadium selfie,” “rainy-day couch”).
Brands can then bid on those micro-moments to buy audiences.
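Cluep hasn’t published Soma’s internals, so purely as an illustrative sketch (every tag value, name and mapping below is hypothetical), the tag-then-bid flow described above might look something like:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical per-post tags an emotion-targeting agent might emit.
@dataclass(frozen=True)
class PostSignals:
    emotion: str   # e.g. "thrilled", "nostalgic", "hangry"
    context: str   # e.g. "stadium selfie", "late-night feed"

# Brand-defined micro-moments bought at auction, mapped to creatives.
# In a real system these would come from a bidding platform, not a dict.
CAMPAIGN_BIDS = {
    ("hangry", "late-night feed"): "burger-coupon",
    ("thrilled", "stadium selfie"): "event-merch",
}

def match_creative(signals: PostSignals) -> Optional[str]:
    """Return a creative to serve if the post matches a bought micro-moment."""
    return CAMPAIGN_BIDS.get((signals.emotion, signals.context))

print(match_creative(PostSignals("hangry", "late-night feed")))  # burger-coupon
print(match_creative(PostSignals("bored", "office desk")))       # None
```

The point of the sketch is the lookup shape: the agent reduces a messy post to a small (emotion, context) pair, and the buy is simply a match against pairs a brand has paid for.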
For marketers fighting creative clutter, matching vibe to message is potent: open-rate jumps of 20–40% aren’t uncommon in beta tests.
Need receipts?
According to Ad Age, Coca-Cola’s Saudi team pointed Soma at 828,000 self-declared fast-food fans, timing mobile coupons to land just as late-night burger pictures started flooding feeds.
Engagement spiked, redemptions beat goal, and the brand pocketed a tidy lesson in feel-first media.
Even so, here's where emotion targeting might go off-track:
Tone drift – if your brand voice is “warm mentor,” you don’t want an over-amped agent yelling BUY NOW!!! at users who are merely curious.
Context creep – reading a sad selfie as “needs retail therapy” can feel exploitative.
Bias loops – algorithms trained on skewed datasets may mislabel darker-skinned faces or non-Western gestures.
Privacy optics – consumers bristle at ads that feel psychic; be transparent about data use.
Bottom line: feel-first advertising is the next performance frontier for brands that are prepared to go there. No doubt a great many will.
_____
This post was created by a trained GPT with just a little help from a human.