Monday, March 26, 2012

Social media revolution or evolution?

It's 2012...and frankly it is starting to move very quickly. We are almost finished with the first quarter of the year. As I am now entrenched in the maturation of a new marketplace (not only the creation of social media but the use of it for running the business), I am starting to think about what key factors will drive the success or failure of those trying to help companies use the data to their advantage.

There is an ongoing battle being debated one company at a time. It is this...what is the best way to tap into what I fondly refer to as THE CROWD? That is, using all the social media data available to understand consumers in their new natural habitat. The questions troubling everyone at this point in the debate are the following:

1. How much data is enough to make decisions?
2. What are the metrics that will help you measure its impact?
3. Can you trust the dataset that is being created on the web?
4. Can I trust the content?
5. Is getting the best quality data more important than getting all the data to make decisions?

This is essentially the PRECISION versus RECALL debate.

Precision, by definition, is getting the most accurate data to make decisions. Recall is not missing a single piece of data when deciding what to do. It is a discussion that has been going on over the last 24 months.
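For those who want the textbook version of these two terms, here is a quick sketch (the sets and numbers are made up just to show the math):

```python
# A minimal sketch of the classic precision/recall calculation, assuming we
# have labeled a sample of collected sound bites as relevant or not.

def precision_recall(retrieved: set, relevant: set) -> tuple[float, float]:
    """Precision: share of what we collected that is actually on-topic.
    Recall: share of everything on-topic that we managed to collect."""
    true_positives = len(retrieved & relevant)
    precision = true_positives / len(retrieved) if retrieved else 0.0
    recall = true_positives / len(relevant) if relevant else 0.0
    return precision, recall

# Toy example: 80 of the 100 posts we pulled are on-topic,
# but 200 on-topic posts exist in total.
retrieved = set(range(100))
relevant = set(range(20, 220))
print(precision_recall(retrieved, relevant))  # (0.8, 0.4)
```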

As a member of the precision camp for more than 5 years, I have defended the following concept for a long time. I used to say that the key is being able to collect the right data set the same way every time, so that you are comparing apples to apples with every study you do. Call it the way of the scientist (which I am), because I have been trained to carefully manage variables to make sure the right study is done. With only the best data, I have argued, you can do your work to understand consumers in the social media sphere. I have championed the idea that it takes many measures like the ones I have access to (that is, buzz, sentiment and passion data) to create meaningful business knowledge and metrics. The whole time, I have been able to prove time and time again that I can develop insights and create social media theories that push the envelope.

I am here to say that I have been wrong...dead wrong.

In the last 3 months, I have learned something critical to the success of any social media program. In order to make the shift across the enterprise, it takes BOTH precision (my camp) and recall (the other camp) to really tell the complete story for the business.

Why?

Three reasons:

1.  Recall is critical to surfacing issues being discussed across social media. I have had some experience with methodologies (which I can't discuss yet) that are showing me recall-type measurements are outstanding and far more sensitive for surfacing an issue. If you can grab the key discussion points from all that is being said, you will get to the "what" more effectively.

2.  Precision helps you understand why. I have one customer I work with who has shown me how they help their constituents understand what consumers are saying online about their brands. What I learned is this: they cannot get the depth of insight they need from simple recall. Recall is the fluff, and precision represents the depth, in studying consumers online.

3.  Both precision and recall will be the key to unlocking one's ability to measure the ROI of social media. What do I mean by this? Anyone who has spent time trying to figure out this problem gets stuck creating data breakdowns that lead down the path of someone asking them "so what?" The reality, and the next big thing in social media validation, is that the data that can be collected at speed (the buzz, the sentiment and the passion...the recall and the precision) will be mixed with the data that companies already trust.

The fight over how to best leverage social media data to run the business has always been founded in what I consider a culture change. Even when you steadily prove what it can do, there is always someone who is skeptical of what you did. This is the nature of culture battles. Social media analytics has always asked people not only to use new methods but also to use a new set of data. To overcome the changes needed by the organization, those interested in proving its worth will need a way to mix what is trusted and what is new.

Precision and recall will be the path to success. Mixing all the data (so those wondering if they are missing something can be appeased) with the most accurate ways to measure consumers' opinions is the first step. The second is building trust by making sure the measures near and dear to their risk-averse hearts (sales, units, margin...etc.) are part of the solution, not an old friend to be cast away for something new and better.

That is all I have to say for now...but stay tuned here for the proof in the pudding...

I will go Social Medieval on this point soon...

stay tuned!

Thursday, March 22, 2012

Features, Content and Accuracy...Oh MY!

I figured I would write a short post from the trenches.  I wanted to pose a question to the masses...


How do you know that your "counts" of sound bites are even accurate?

I have been spending an enormous amount of time working with people across the social media wild west, and one of the things I cannot get my head around is the battle of numbers in social media data. Culturally, we are shifting the very foundation of what data means, from a place of strict quantification to a place of "the more data the better," taken on blind faith.

As a scientist trained in variable management, I was always taught that there is no bad data, only data. And this tenet is surely true as the field of social media measurement continues to slowly mature. That being said, I am surprised at how hard people push in the area of quantification while rarely questioning how the data was counted.

Yes, it is imperative that when someone tells you a sound bite is positive, it truly is; but for any technology company, living up to a perfect coding standard is impossible. Then I will see someone question how many sound bites they have on their brand.

I am shocked that people are so quick to declare victory over their Facebook fan page likes, but are so loath to stop and think when you tell them that their multimillion-dollar campaign only generated 7,500 sound bites naturally. Why is this the case?

Yes, social media data is extremely powerful. The data is vast, and the potential for real-time understanding translating into business results is real and will become tangible as this year wears on; but to simply say more data is better, without asking how that number was generated, is folly.

I figured I would post some thoughts on how to think about social media analytics to "clear the air".  We all want our tools to look pretty and do wild things.  But what else is there to consider?  I would argue there is more than one thing to think about when getting a great tool.

In fact, there are three critical components to think about:

Features - This is the obvious part. This is about bells and whistles, slicing and dicing the data so you can mine it and use it. Most people focus solely on the ease of use that a tool has. And by the way, I couldn't agree more with that thinking. Many tools are very powerful but very difficult to use. Having something that is flexible, powerful and nimble is critical. This is becoming table stakes for anyone interested in building something good. But frankly, if you have a great tool, there are two other things to think about...

Content - This is the point I am trying to make here. Did you get ALL the content? Or better yet, did you get ALL the CLEAN content? Here is the distinction...if you get 100,000 sound bites and 30,000 are duplicate posts, it doesn't matter how great your features are; you now have crappy content and you are comparing apples to oranges in doing your work. Many social media junkies will focus solely on the numbers and not on whether they are accurate. Are you asking that feature provider to open their kimono and tell you how they get their content? Do you know how they verify their streams? I think people need to spend a whole bunch more time on this issue before worrying about how prettily the data is sliced and diced. Why? Because if you are worried about the ROI of social media data, then your content, and thus your counts, had better be good, or your data is bad and your recommendation is flawed.
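To make the duplicate problem concrete, here is a rough sketch of what a de-duplication pass looks like, assuming each sound bite is a simple record with an id and text (the field names and normalization rules are mine for illustration, not any vendor's):

```python
# A minimal sketch of a de-duplication pass over collected sound bites.
import re

def normalize(text: str) -> str:
    """Lowercase, strip URLs and collapse whitespace so near-identical
    reposts hash to the same key."""
    text = re.sub(r"https?://\S+", "", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def dedupe(sound_bites: list[dict]) -> list[dict]:
    seen, unique = set(), []
    for bite in sound_bites:
        key = normalize(bite["text"])
        if key not in seen:
            seen.add(key)
            unique.append(bite)
    return unique

bites = [
    {"id": 1, "text": "Loved the Cars 2 trailer! http://t.co/abc"},
    {"id": 2, "text": "loved the cars 2 trailer!"},  # duplicate repost
    {"id": 3, "text": "The sequel was just okay."},
]
print(len(dedupe(bites)))  # 2 -- your "count" just changed
```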

Accuracy - Again, if you are trying to understand what social media is telling you (not just listen), then you need to be concerned with this small thing. No one is going to get you all the way to 100% accuracy. At least, not until Skynet is built and we all go down in a ball of MATRIX flames. But for now, after you get your counts right, you need to make sure that sentiment is accurate. There are a lot of claims out there as to how to best calculate it. While I work for an NLP company, I won't argue that we are the end-all, be-all, but we do go for accuracy on the smaller sample set as one of our targets, to help you know WHY people feel a certain way. Accuracy is the key to good sentiment analysis and critical if you want to get away from coding a thousand sound bites by hand...ugh.
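And to make the accuracy point concrete, here is a minimal sketch of checking automated sentiment calls against a hand-coded sample (both label lists below are invented; in practice the hand-coded labels come from human reviewers scoring a random slice of the sound bites):

```python
# A minimal sketch of measuring sentiment accuracy against human coding.

def accuracy(machine_labels: list[str], human_labels: list[str]) -> float:
    """Share of sound bites where the machine agrees with the human coder."""
    matches = sum(m == h for m, h in zip(machine_labels, human_labels))
    return matches / len(human_labels)

machine = ["pos", "neg", "neu", "pos", "neg", "pos", "neu", "pos"]
human   = ["pos", "neg", "pos", "pos", "neg", "neu", "neu", "pos"]
print(f"{accuracy(machine, human):.0%}")  # 75%
```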

Why am I writing this? Because I see a lot of resistance from the people I work with when they are trying to get comfortable with social media data. It starts with not asking questions about what appears to be bigger and better. It is about making sure that what is under the hood is in working order; otherwise you just bought a car that has a beautiful body, paint job, awesome seats, great stereo and an even cooler steering wheel, but is powered by a YUGO engine. That car is great to stand around at a party and look good, but couldn't get you to the liquor store to buy a case of beer...

Tuesday, March 13, 2012

Looking at the space between the cracks - Between the theatre and DVD

In the ever-continuing quest to figure out how to apply social media to business, I continue my mini use cases with a look at something different. In the past, I have looked at politics a ton. Most recently, I looked at the concept of the social media fingerprint (which in my professional work continues to bear fruit as an interesting concept), and today...what about white space?

What do I mean by white space? What happens between product launches. You can apply this idea on a longer time frame, from the first launch of a product to its product improvements. In fact, some would call this launch analysis, because the time frame between product launch and improvement can often be months. Another good example would be monitoring a test market of a new product or service. Many times, if one studies what occurs during the test market, there is a great chance to learn something before going to full launch. The focus of this study is a shorter white space: between when a movie launches and when its DVD goes live.

What is amazing about this particular use case is that the time frames are getting so short between when movies hit the screen and when we see them at home that social media will become a critical methodology for studying such white space. 

In the analysis I will do today, I chose to look at the movie Cars 2. The reason for this choice is that both the movie and its DVD launched in 2011. Its box office totaled $191MM in the U.S. and $368MM outside the U.S. In addition, it ranked #4 in DVD sales for 2011 with $71MM. The movie hit theaters June 24th, 2011, and the DVD launch occurred on November 1st, 2011. Below is an analysis of the launch, the white space period and the period after the DVD launch.

First, let's take a quick look at the net sentiment and summary metrics for the entire year around the topic of Cars 2.


This chart gives a very good overview of the movie Cars 2. You will see that net sentiment for the film is pretty strong overall. What is good about this full-year look is the obvious spikes that occur in sentiment around the time of the launch, with a smaller spike during the DVD release. This correlates with what one would expect from the movie launch.
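The exact formula behind the net sentiment metric in these charts belongs to the tool vendor, but a common way to roll up a figure like this is simply positive minus negative over the total opinionated mentions; a minimal sketch with invented counts:

```python
# A minimal sketch of one common way to compute a "net sentiment" style
# figure: (positive - negative) / (positive + negative). The vendor metric
# behind the charts in this post may be computed differently.

def net_sentiment(positive: int, negative: int) -> float:
    return (positive - negative) / (positive + negative)

# e.g. 6,000 positive vs. 2,000 negative sound bites in a month
print(f"{net_sentiment(6000, 2000):+.0%}")  # +50%
```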

As we analyze the concept of movie white space, we will look at three quarterly periods. The first is May, June and July. This represents the launch of the movie and about six weeks after launch. The white space period includes August, September and October. This is perfect, as the DVD launched on November 1st, at the end of the white space period. The third period, after the DVD launch, includes November, December and January.
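For anyone who wants to reproduce this kind of cut on their own data export, the only mechanical step is bucketing sound bites by date; a minimal sketch, with field names and sample records that are purely illustrative:

```python
# A minimal sketch of bucketing sound bites into the three periods used in
# this post. Assumes each record has a "date" field.
from datetime import date

PERIODS = {
    "launch":      (date(2011, 5, 1),  date(2011, 7, 31)),
    "white_space": (date(2011, 8, 1),  date(2011, 10, 31)),
    "post_dvd":    (date(2011, 11, 1), date(2012, 1, 31)),
}

def bucket(sound_bites: list[dict]) -> dict[str, list[dict]]:
    out = {name: [] for name in PERIODS}
    for bite in sound_bites:
        for name, (start, end) in PERIODS.items():
            if start <= bite["date"] <= end:
                out[name].append(bite)
                break
    return out

sample = [{"date": date(2011, 6, 24), "text": "Cars 2 tonight!"},
          {"date": date(2011, 9, 15), "text": "Cars 2 app is fun"},
          {"date": date(2011, 11, 2), "text": "DVD arrived"}]
print({k: len(v) for k, v in bucket(sample).items()})
# {'launch': 1, 'white_space': 1, 'post_dvd': 1}
```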

Let's look at the launch to get a sense of what people were saying. I will look at four things here: the net sentiment trend for the period (which you see above), the summary data, the sources breakdown, and the top 10 likes and dislikes for the period.

For the launch...




You will see a blow-up of the launch period. In this period, we see the majority of the sound bites. In addition, the net sentiment and passion intensity scores vary little from the overall data set. As for the source breakdown, almost 87% of the traffic comes from social networks and Twitter combined. These two sources often represent mainstream discussion. One would expect during the launch that much of the chatter would occur on these channels.

As for likes and dislikes...some of the highlights are sort of interesting. For one, the major like is the movie trailer. If you remember, this movie didn't receive very strong critical acclaim. It seems that most people were excited by the concept of the movie more than its delivery. In fact, none of the other likes (while voluminous) are very strong. They include "good movie" and "okay." These are not ringing endorsements of the film. It is interesting to note here that this movie did almost $600MM in sales, with only about 33% coming from the U.S.

As for dislikes, we see many mentions of the lukewarm critical reception. In fact, "rotten" shows up (which points to Rotten Tomatoes' less-than-fresh rating of 38%).

Let's move on to the white space breakdown.

Below is the same set of charts we saw for the launch period.


The summary metrics for this period look very similar to the launch period except for the fact that the volume is much lower.  You will see more peaks and valleys in the sentiment postings.  But as you move into the likes and dislikes you will see some differences.

In the likes category, you can see some of Pixar's marketing efforts emerging. The number one mention is the Cars 2 app. At the launch of the movie, the app launched and was a huge success in the marketplace. In addition, we see mention of the Cars 2 toys. What is evident here is that Pixar's marketing efforts are keeping buzz on the movie going beyond its stint in the theaters. This is very interesting because it shows that keeping the movie top of mind could be a critical component of the white space period for films between the theater and the DVD launch.

What becomes most interesting, however, is the change in the sources breakdown. Previously, 87% of traffic came from consumer channels like Twitter and Facebook. During the white space period, this number drops to 64% (a change of 23 points on an absolute basis). Meanwhile, blogs and forums have gone from 18% to 31%.
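The math behind a sources breakdown like this is nothing more than counting sound bites per channel and turning the counts into shares; a minimal sketch with invented counts (not the actual Cars 2 data):

```python
# A minimal sketch of computing a sources breakdown for one period.
from collections import Counter

def source_share(sound_bites: list[dict]) -> dict[str, str]:
    counts = Counter(b["source"] for b in sound_bites)
    total = sum(counts.values())
    return {src: f"{100 * n / total:.0f}%" for src, n in counts.most_common()}

launch_period = (
    [{"source": "twitter"}] * 500
    + [{"source": "social_networks"}] * 370
    + [{"source": "blogs"}] * 80
    + [{"source": "forums"}] * 50
)
print(source_share(launch_period))
# {'twitter': '50%', 'social_networks': '37%', 'blogs': '8%', 'forums': '5%'}
```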

What does this mean? It means that the social media fingerprint is a real thing. You can read about the hypothesis here (link to Iran blogpost on social media fingerprint). As I have written previously, blogs represent a place where experts speak to the world, and forums are where those who like a topic chatter with each other. Facebook is friends talking amongst themselves. And Twitter is the individual shouting out to others, letting them know how they feel.

How does this manifest for the movie industry? My hypothesis is that to best study the consumer for a DVD launch, one must look to the blog/forum chatter to connect with those who will buy the DVD. These people are talking about their thoughts, and movie buffs are usually the ones interested in movies. This isn't the greatest example because in the case of Cars 2, a children's movie, parents will simply buy it. But the fact that these channels grow in chatter suggests they are a robust place to study.

Lastly, I will post the period after the DVD launch.




This period shows some continued evolution. For one, from a likes perspective we see two pieces of the pie around "take flight" and "thrill." Both of these slices center on the marketing of a new Cars 2 toy (presumably for Christmas). Again...the marketing continues and keeps the momentum for the film. On the dislikes side, there is a unique mention of "not get nomination." The chatter continues for the film. The volume is reduced, but there continues to be evolution in the topic.

Once again, the forums and blogs percentage has gone up, to 40% from 31%. This shows that the continuing dialogue around Cars 2 is eventually centering on passionate people discussing the film, not mainstream consumers (when compared to the launch period).

Let's look at the brand passion.

The first is a chart of the brand passion during the three periods.

This chart shows that the Rotten Tomatoes score of 38% is about accurate. In this quadrant, which looks at buzz (size of bubble), sentiment (y-axis) and passion (x-axis) to get an overall view of the brand across time, we see that people simply liked this film. We also see that passion decreases during the white space period and then increases again after the DVD launches.
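For readers who want to build a similar view of their own brand, here is a minimal sketch of a quadrant chart laid out the way it is described above (passion on the x-axis, sentiment on the y-axis, buzz as bubble size); the three period values are invented placeholders, not the actual Cars 2 scores:

```python
# A minimal sketch of a brand passion quadrant chart with illustrative data.
import matplotlib.pyplot as plt

periods = {
    # name: (passion, net_sentiment, buzz_volume) -- placeholder values
    "launch":      (0.62, 0.55, 90_000),
    "white_space": (0.48, 0.52, 30_000),
    "post_dvd":    (0.58, 0.50, 45_000),
}

fig, ax = plt.subplots()
for name, (passion, sentiment, buzz) in periods.items():
    ax.scatter(passion, sentiment, s=buzz / 200, alpha=0.5)  # bubble = buzz
    ax.annotate(name, (passion, sentiment))
ax.set_xlabel("Passion intensity")
ax.set_ylabel("Net sentiment")
ax.set_title("Brand passion by period (illustrative values)")
plt.show()
```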

This helps us get a sense of what was going on for the film over time.

I also looked at the white space period's BPI with a breakout of the sources during this timeframe.

It is below...

Here we see how each source contributes to the brand passion during the white space period. This is a good guide for deciding where the passion is coming from before the DVD launch. It helps you see where the positive chatter is happening and where the negative chatter is happening. It is a map for studying your consumer prior to launch.
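At its simplest, a source-level breakout like this boils down to computing the sentiment figure per channel instead of for the whole period; a minimal sketch with invented records:

```python
# A minimal sketch of breaking net sentiment out by source for one period.
# Records, field names and values are illustrative.
from collections import defaultdict

def sentiment_by_source(sound_bites: list[dict]) -> dict[str, float]:
    pos, neg = defaultdict(int), defaultdict(int)
    for b in sound_bites:
        if b["sentiment"] == "pos":
            pos[b["source"]] += 1
        elif b["sentiment"] == "neg":
            neg[b["source"]] += 1
    return {src: (pos[src] - neg[src]) / (pos[src] + neg[src])
            for src in set(pos) | set(neg)}

white_space = [
    {"source": "forums", "sentiment": "pos"},
    {"source": "forums", "sentiment": "pos"},
    {"source": "forums", "sentiment": "neg"},
    {"source": "twitter", "sentiment": "pos"},
    {"source": "twitter", "sentiment": "neg"},
]
print(sentiment_by_source(white_space))
# e.g. {'forums': 0.33, 'twitter': 0.0} (key order may vary)
```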

While there is major planning around the entire period when a movie launches, this study has shown there is great nuance in what social media can tell us. In the case of Cars 2, there seem to be changes in the chatter during this crucial white space period. While I have only scratched the surface in this blog post, the concept of the social media fingerprint again plays out. The question remains: will the current culture of planning and process keep organizations from learning on the fly as their critical dollars are spent on their business?

It looks like Pixar has the last laugh with its mediocre release...why? Because of the $600 million box office and $150MM in DVD sales...and we don't even know what the toys brought in.

Happy viewing...