The Server and Cloud Community Lead Blog

A place to share my thoughts on community and what I am doing inside of Microsoft to drive robust conversations between our Product Development Teams and our customers.

Where is the love? The SBS Community Survey is floundering with very few responses.

OK, I thought this would be like shooting fish in a barrel, but apparently this isn't the case. So far we have had 250 responses in 2 weeks. At this rate, we will set a new record for the fewest responses in a month by a country mile.

Is everyone so busy that they can't spare about 5 minutes to take this? Has the SBS Community taken a sabbatical? Does anyone care?

Seriously, folks. We care an awful lot and we really want to hear from you. Please take a few minutes and tell us what you think. I promise we will follow up on your feedback and do what we can to make your voices heard.

Here is the link:  SBS WW Community Survey 2008

 Thanks again,

Kevin

Comments
  • Hi Kevin,

    Firstly, I want to say that I'm one of the UK SBSC PALs and I have completed the survey and encourage colleagues to do so as well. Also, I provide a huge amount of feedback to Microsoft, both here in the UK and to Corp, so I understand the value of taking time out to do this and that the return is not always direct to me.

    But how do we know, as Partners, that there will be positive change as a result of this? What conclusions from the last survey resulted in changes that helped me, as a Partner, sell more and be more effective as a business?

    To give you one example, I have asked for the Demo Showcase VMs (not the simulations) to be made available to SBSC Partners, but despite assurances this has never happened, nor has anyone ever explained to me why it would be a bad idea.

    To be honest, I hear a lot from Microsoft about getting feedback and wanting to listen, but when it actually comes to delivery the excuses start to come out ... "we've no budget this year", "it would be far too expensive to do that", "we can't actually measure the value of SBSC" ... and the list goes on. When I have to spend time (a lot of time!) explaining the value of SBSC to other Microsoft people, therein lies the problem.

    I will continue to give constructive feedback and encourage others to do the same, but we want delivery and people who can achieve it.

  • I have also completed the survey but I won't recommend that my peers do the same. I share many of Vijay's concerns and have a few more of my own.

    - Like Vijay, I'd find this year's survey more compelling if I knew what the results from last year were and knew how Microsoft had responded to those results.

    - The survey refers several times to the "Windows SBS Community" without defining what that term means. Is it limited to the SBS Community Resources listed in the survey? Is that a generally accepted definition? It would be interesting to know how many people could even list what the SBS Community Resources are without any hints. How many people would consider PSS to be part of the Community?

    - It's hard to understand how the survey will be helpful. Suppose the average overall satisfaction score for the community goes from (for example) 3.3 to 3.8. What does that tell you? How can you act on such information? How do you know what's driving that number for any individual or group of respondents?

    - The concept of "value" is somewhat vague. Each resource listed can potentially be of great value or zero value, depending on when and how it is used. Some resources are better for general product and sales information, some are better for ongoing education and training, some are better for troubleshooting, etc. If one of the resources gets higher scores than another, it doesn't necessarily mean that the "losing" resource is no good.

    - Expanding on that last point, it might have been better if you had said "here is a list of 10 questions that someone working with SBS might have. For each question, how likely would you be to consult each of these community resources in search of an answer?" If, hypothetically speaking, you found that there was a 90% chance that people puzzled by an error message would go to third-party web sites but only a 30% chance that they'd go to the MS KB, then you'd know something was very wrong with the KB.

    - Your survey kind of assumes that respondents can easily differentiate between the resources that they end up using. In my experience, many SBSers turn to search engines when looking for answers. Those search engines may spit back answers from several different community resources at once. Am I really supposed to remember how many times a search engine brought me to TechNet as opposed to a KB article as opposed to something at www.microsoft.com as opposed to a newsgroup post, etc.?

    - I could go on, but the bottom line is that it's very hard for me to understand what the survey's purpose is or how it can possibly achieve its purpose, whatever that purpose may be. That is why I am reluctant to encourage others to fill it out, and it may be why your response rate is low. Perhaps there are hundreds or thousands of people who have reviewed the survey questions and have decided that it's not worth answering.

    Respectfully,

    David Schrag

  • "Because of the previous surveys, we have made some adjustments in our community engagement. To name a few; we really worked on putting more focus on the Official SBS Blog. Our Sustaining Engineering team runs mini betas with our MVPs before KB's and Bug Fixes are released via Windows Update. We try to get more involvement in our techbetas from our User Groups. The list does go on."

    Did you miss that, David, in the first post?

    There's a box at the bottom to fill out whatever you like, David.

    I'm sorry, but whenever someone with Kevin's passion for community asks, I still fill it out, even if I'm not fond of all of the questions.

    Supporting Kevin, in turn, supports the community.

    You are shooting yourself in the foot David.

  • Thanks for the responses. Keep them coming. You are certainly giving me something to talk about versus the chorus of crickets I have been listening to for the past two weeks.

    I will continue to respond to your questions and try to answer them all if you promise to keep asking. :)

    Look, I am not saying the survey is perfect, but if there is one thing I learned back in Survey Methods class, it is that you don't make major modifications to the base of a survey if you plan on comparing data year over year or month over month. I have learned a lot about how I would change things in future surveys, but for now it has remained basically the same so I can see if there are any trends.

    Yeah, I agree, some of the questions suck. I will give you that, and frankly some of the data goes unnoticed because it really doesn't indicate anything except that a number stayed the same or moved a little bit. However, if you look at the ratings and then also look at the comments that A LOT of people provided, there were some really amazing indicators of what the numbers meant. It just takes a lot of time to sort through it all.

    You know, asking / guilting / shaming people into taking the survey certainly can backfire. Do you know how much time it takes to go through all of the data from a survey of this size, especially if you get 700+ responses? It is a full-time job for at least 2 full weeks. Juggling that with an already full load is certainly not a small task, but it is something I take very seriously.

    I love being a Community Lead. I certainly think this is probably one of the best jobs I have ever had. How many people do you know who have a job like this? I get to talk to people like you, hear your questions, concerns, and suggestions, and take that data back to the product team and push for change.

    I will give you an example of one of the things that we learned from last year's survey:

    In last year's survey, CSS got hammered on the overall rating and there were a HUGE number of NEGATIVE comments compared to the year prior. Believe me, the CSS team and I read each and every comment and tried to, and I think succeeded in, gaining some insight into what had people so worked up in a lather. There was supporting information from a 3-day Deep Dive that we did with the SBS MVPs at the MVP Summit prior to the survey, and one of the predominant themes of the conversation with the MVPs was the lack of quality and consistency in the support they got from CSS. They gave us this feedback with one caveat: the CSS team in Las Colinas was first rate. The problem as they saw it was that they had to escalate through the first-tier engineers to get to the second tier (Texas). They felt that the level of support from some of the newer people on the block was not as good as that from the more seasoned engineers who have been supporting SBS since the stone ages.

    The CSS management team already knew these concerns and were working tirelessly to continue to grow and train their team in the US and abroad. Since that last survey and MVP Summit Deep Dive, I feel that the quality of support has improved tremendously. I could be wrong, and I think the survey would be one good indicator of whether or not the perception of their support has improved. So, if you have an opinion on CSS that you would like to pass along, here is your vehicle.

    I could go on, but I think I have exceeded my limit for a reply to a comment.

    Please just do me a favor, don't stop bringing it. We want to hear from you. We want to know what you think we are doing wrong, what we are doing right and what we need to start doing that we aren't doing now.

    Thanks again,

    Kevin

  • Susan: Yes, I did miss that in the first post. I'm not sure I read the first post. It was probably not Kevin's blog that led me to the survey but rather one of the dozen or so other community blogs, so I jumped right to the survey and started answering the questions. And perhaps I'm shooting others in the foot by not pushing the survey, but I'm not shooting myself. I have provided plenty of feedback to Microsoft in various forms over the years, and anyone there who wants to hear more of my opinions knows how to reach me.

    Kevin: Your rationale for not changing the questions from year to year makes perfect sense IF the basic survey design is sound. But the basic survey design here is fundamentally questionable because you're not getting a random sample. You're only getting motivated respondents, and you really have no way of knowing which way that will skew the answers. And even if it were a random sample, the collection of the same meaningless data year after year does not make it meaningful. (I'm thinking of question 5 in particular.)

    I commend you for trying to get feedback on various resources, but why are you using this tool? If I recall correctly, CSS and the monitored newsgroups have their own satisfaction surveys that immediately follow the incident. If I were the CSS manager, I'd put a lot more stock in comments and ratings from people who had just finished receiving support than from people who may or may not have had any interaction with CSS in the last year. (Just because you tell people to base their ratings on the last 12 months doesn't mean they will.)

    As for gathering data on which MS community resources are preferred by the community, surely you have a more objective way of measuring that. For example, I assume you know exactly how many people attend each SBS webcast, event, chat, etc., and I know that most of these events ask participants for satisfaction ratings at the end. That must be a more valid indicator of popularity and satisfaction than what you're getting from the survey. I also assume that you know how many hits and inbound links are associated with each KB article, TechNet article, MS blog, SBS page, etc. Again, this objective data ought to be given more weight than the 5-point-scale ratings of some self-selected survey respondents.

    I agree with you that there can be tremendous value in open-ended qualitative responses, but as you point out, analyzing hundreds of responses to the question "please make whatever comments you'd like to make" is downright difficult. If you are seeking specific feedback on specific resources that you or others are providing, you'll probably get data that is richer and easier to analyze if you ask pointed questions in this blog, or in the mssmallbiz yahoo group, or in a poll on the mssmallbiz sharepoint site, etc.

    One thing I'm sure of is that the SBS community is not a bunch of crickets. Ask Eric Ligman if he ever feels at a loss for feedback. We won't hesitate to tell you what we like and what we don't like. We just have to be asked the right way.

  • Kevin,

    It's nothing personal; I work with a lot of great people at Microsoft and some are even my friends - I think? Yes, there have been changes as you outlined: the Managed Newsgroups are 24/7, and the Action Pack qualification criteria have changed. But, and it's a big but from me: where is the strategic view of SBSC from Microsoft? What's your vision and aim for it? When can we move away from the numbers game and just get Microsoft to work with its most proactive SBSC Partners? How about raising the bar on SBSC and making it an ever more valuable and respected certification in SMB? There has been lots of feedback on this.

    Cheers,

    Vijay

  • I figured a new blog post would be better than continuing a thread of comments on the previous blog post.
