Brands realize that the firehose of social media messages about them on the Web cannot be analyzed without automated tools. (Photo Source: Defense.gov)
Yahoo’s Tumblr is the embodiment of talking with pictures. Where else can you find daily devotees of Humans of New York, Seriously Ugly Race Pics or Fashion Tips From Comic Strips? Boasting more than 138 million blogs and 62 billion posts, Tumblr has become a treasure trove for brands to learn what people want to eat, drink, wear and watch.
So when Tumblr signed a deal with DataSift this month to make its full firehose of consumer data available to brands, the advertising world took notice. Popularity doesn’t always mean profitability, of course. For Tumblr to monetize its 19 billion monthly page views, advertisers need to know exactly who is visiting and why.
According to TechCrunch, Tumblr chose DataSift because of its performance as a content partner with Twitter. DataSift is one of only four Certified Data Resellers given access to the full stream of public Tweets, both in real time and in archives. The other data analysis company partners are Gnip, NTT DATA and Topsy.
Twitter is very selective about how many partners have full access to its firehose. Currently, there are only four Certified Data Resellers.
Below is an example of how DataSift breaks down a Tweet into consumer data. Using natural language processing and text sentiment analysis tools, the company measures whether a conversation about a brand is positive, negative or neutral. It takes into consideration the content — page title, full URL, metadata — of any shared links.
The sender of the Tweet is also categorized by age, gender, geography and their social media influence. Hashtags and keywords are factored in as well.
The Anatomy of a Tweet — Now multiply by a billion. (Source: DataSift.com)
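As a rough illustration of the kind of structured record such a pipeline might produce, here is a minimal sketch in Python. The field names, sentiment lexicon and scoring rule are all invented for illustration; DataSift's actual schema and natural language processing are far more sophisticated.

```python
# Hypothetical sketch of turning a raw tweet into a structured consumer-data
# record. Field names and the crude lexicon scorer are invented, not DataSift's.

POSITIVE = {"love", "great", "awesome", "delicious"}
NEGATIVE = {"hate", "awful", "terrible", "broken"}

def score_sentiment(text):
    """Crude lexicon-based sentiment: positive, negative or neutral."""
    words = {w.strip("#!.,").lower() for w in text.split()}
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def enrich_tweet(text, author):
    """Combine text analysis with author demographics into one record."""
    return {
        "text": text,
        "sentiment": score_sentiment(text),
        "hashtags": [w for w in text.split() if w.startswith("#")],
        "links": [w for w in text.split() if w.startswith("http")],
        # Author attributes (age, gender, geography, influence) would come
        # from profile data; these keys are illustrative only.
        "author": author,
    }

record = enrich_tweet(
    "Love my new #Tumblr theme! http://example.com",
    {"age_range": "25-34", "geo": "US", "influence": 72},
)
print(record["sentiment"])   # positive
print(record["hashtags"])    # ['#Tumblr']
```

Real systems replace the word lists with trained language models, but the shape of the output, text signals plus sender attributes in one record, is the essential idea.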
As the science of extracting social media data continues to progress, we’re particularly excited about what’s next in the technology pipeline. Currently, all the approaches to understanding and leveraging consumer conversations revolve around text analysis. When photos are shared, those pictures are contextualized through their URLs, captions and hashtags.
The next step is to “see” and extract customer data from the pictures themselves.
Pongr’s image recognition technology identifies the brand logos, packaging and advertising that appear organically in consumer photos. For example, when your friend snaps a selfie at the family breakfast table, what kind of butter or cereal is everyone eating? The caption won’t tell you, because the photo might focus on the homemade waffles. It also won’t tell you because people don’t naturally share photos for the benefit of advertisers, unless there is an incentive like coupons or prizes.
Among the more common posts on my personal Facebook news feed are photos of friends’ beverage cups at Starbucks. If you’re not a coffee drinker, let it be known that Starbucks has (unintentionally?) turned the simple act of writing your name with a Sharpie into a social media sensation.
Several of my friends regularly post their personalized cups to share joy over their Starbucks break or to marvel at how their name is always misspelled. In any case, many times they don’t even write a status update. The picture speaks for itself. To friends of Marco (below), this pic represents his ongoing frustration at being labeled Mark, Marc or other creative variations.
Personalized Starbucks cups make frequent cameos in Facebook news feeds, but they aren’t always captioned or hashtagged. (Source: Facebook)
When brands try to extract actionable insights from the firehose today, they miss countless silent conversations: the ones conducted entirely in photos, without hashtags or commentary.
By adding image recognition to the mix, analysts can detect the Starbucks mermaid in the photo above. They can even spot other products that seem unrelated to the post but might reveal a meaningful buying pattern.
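To make the idea concrete, here is a toy sketch of logo detection by template matching over a tiny pixel grid. This is purely illustrative; production systems like Pongr’s rely on far more robust computer vision models, but the core task is the same, finding a known visual pattern inside a larger photo.

```python
# Toy logo detection: slide a small template over an "image" (a 2-D grid of
# pixel values) and report the best-matching location. Illustrative only.

def match_score(image, template, top, left):
    """Fraction of template pixels that match the image region at (top, left)."""
    th, tw = len(template), len(template[0])
    hits = sum(
        image[top + r][left + c] == template[r][c]
        for r in range(th) for c in range(tw)
    )
    return hits / (th * tw)

def find_logo(image, template, threshold=0.9):
    """Return (score, row, col) of the best match above threshold, else None."""
    th, tw = len(template), len(template[0])
    best = None
    for top in range(len(image) - th + 1):
        for left in range(len(image[0]) - tw + 1):
            s = match_score(image, template, top, left)
            if s >= threshold and (best is None or s > best[0]):
                best = (s, top, left)
    return best

# A tiny "photo" with the 2x2 "logo" embedded at row 1, column 2.
logo  = [[1, 0],
         [0, 1]]
photo = [[0, 0, 0, 0, 0],
         [0, 0, 1, 0, 0],
         [0, 0, 0, 1, 0],
         [0, 0, 0, 0, 0]]

print(find_logo(photo, logo))  # (1.0, 1, 2)
```

Real detectors must also handle rotation, scale, lighting and partial occlusion, which is why brand-logo recognition in the wild uses learned features rather than exact pixel matching.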
We see a continued convergence between visual and text-based sentiment analysis. We envision a growing demand for what we call “visual listening.” What can consumer photos tell us about what people are buying, where they’re buying it, and when?
The next step is applying computer vision technology, adding in-depth photo analysis, to brands’ and agencies’ current social media listening. Only then will we truly get the full picture of what real people are saying.
(Learn more about how Pongr technology helps brands and agencies monetize the stream of consumer photos shared on the Web.)