98% of Social Media Survey Results are Insignificant

This morning I’m sitting at my desk reading the latest social media news on my feed reader – when once again I fall for a social media survey.

You know what I’m talking about – those surveys from advertising agencies or PR firms that announce the “latest social media news” on social media adoption, or trumpet how “ready” marketers are to use social media in their marketing plans.
And of course, I fall for it every time. This morning it was this article that lured me in with its claim that the use of social media tools – such as user comments, blogs, photo sharing and more – on business media web sites “rose by more than 75 percent in the last 6 months.”

After scouring the article for the source of these results, I found nothing. Was it an oversight? Or was it an attempt to send out another “shocking” survey to grab press coverage? After seeing the same pattern in survey after survey, I’m starting to think it’s the latter.

In fact, I’ve realized that these surveys typically have the following in common:

1) Small sample sizes. A sample of 100 or 200 respondents (as in this article) does not license sweeping conclusions about marketers as a whole. Give me a sample large enough to be meaningful – then I’ll use your statistics.

2) Sample bias. My favorites are the surveys that poll marketers at emerging media conferences. Isn’t it blatantly obvious that this produces a biased sample? No wonder the results are shocking.

3) Undisclosed survey methods. The release I read this morning left me in the dark about how the survey was done: how many people were interviewed, what survey instruments were used, or who conducted it. For all I know, the firm polled its own clients or closest friends. Transparency should apply to research methods just as much as it applies to social media.
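To put the sample-size point in rough numbers: a standard back-of-the-envelope margin-of-error calculation shows just how wide the uncertainty is at these sizes. This sketch assumes a simple random sample at 95% confidence, which is generous – conference-floor convenience samples like the ones above don’t even qualify for this formula.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a proportion estimated from a
    simple random sample of size n, at 95% confidence (z = 1.96).
    p = 0.5 maximizes p*(1-p), giving the widest (most cautious) bound."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 1000):
    print(f"n={n}: about ±{margin_of_error(n) * 100:.1f} percentage points")
# n=100:  about ±9.8 percentage points
# n=200:  about ±6.9 percentage points
# n=1000: about ±3.1 percentage points
```

In other words, with 100 respondents a reported “75 percent” could plausibly be anywhere from the mid-60s to the mid-80s – before accounting for sample bias at all.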

So I hope this post encourages you to exercise some discernment between surveys that are statistically sound and those that are not. My projected 98% comes from my own personal experience and was only intended as irony – but the underlying point stands: if you are considering working with a social media agency, advertising agency, or PR firm, think long and hard about the value they place on sound research. The last thing you want is for the results of your own social media campaign to be equally skewed.

3 Comments
  • Durga
    Posted at 05:45h, 27 October

    Lisa,

    I somewhat agree with what you say. In fact, I am working on a social media survey and was searching for some ideas when I found your article. The only thing I did not like is you using numbers again: “98% of Social Media Survey Results are Insignificant”.

  • Jason A. Howie
    Posted at 01:40h, 05 January

    As a research assistant I can tell you it’s very common to see these flaws in research projects. Unless a company is going to hire someone with a doctorate and a research background, there are going to be wild claims with little supporting evidence. Your points in this article are very valid.
