
ITCOMM Community Resources: Survey Tips

These tips were provided by ITCOMM professionals from various institutions in response to the following question posted on the ITCOMM email list on February 19, 2015:

"We're interested in surveying the subscribers to our IT-related email lists to assess engagement and satisfaction with those communication channels. Has anyone done a survey like that? Could you share your survey design and questions?" —Nancy Novitski, Strategic Communications Specialist, University of Oregon

From Alison Cruess, Assistant Director of Communications & Training, University of North Florida:

I have not conducted a survey of email list subscribers, but I can offer some general survey tips and a matrix question I drafted for you.

  1. Start with your survey objectives. (Decide these before you write your first question.)

  2. Ask yourself on EVERY question: Does this question match our intended objective? If not, drop the question.

  3. Keep the survey as short as possible — the shorter the survey, the higher your response rate.

  4. Use clear and concise questions and options.

  5. Always give the respondent a way out (e.g., an "Other" or "I don't know" option).

  6. Think hourglass structure: begin with broad questions, narrow down and focus in, then end with more general questions.

  7. Minimize open-ended questions that require deep thought. Many respondents will skip them or abandon the survey if they are required. Don't make a question required unless it is essential to your survey.

  8. Use branching/conditional logic to target the questions to the respondents and eliminate unnecessary questions.

  9. Test your survey (e.g., with a focus group) before distributing it.

Based on the objective you listed below, here is a matrix question I put together (quickly). For any selection of Dissatisfied or Strongly Dissatisfied, I suggest you have an open-ended question display (use branching) that asks the respondent to explain why; a minimal sketch of that display logic follows the question.

How satisfied are you with these aspects of the communication you receive from IT through the <list name> e-mail list?
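
To make the branching concrete, here is a minimal sketch of that display logic in Python. The matrix items, scale labels, and follow-up wording are hypothetical placeholders, not anything from the original survey; a real survey tool would express the same rule in its own conditional-display settings.

    # Show a follow-up open-ended question only when a respondent
    # picks a negative rating on a matrix item. All names and labels
    # below are illustrative placeholders.

    FOLLOW_UP_TRIGGERS = {"Dissatisfied", "Strongly dissatisfied"}

    def follow_up_questions(responses):
        """Given {matrix item: selected scale label}, return the
        open-ended follow-ups to display to this respondent."""
        return [
            f"You rated '{item}' negatively. Please tell us why."
            for item, answer in responses.items()
            if answer in FOLLOW_UP_TRIGGERS
        ]

    if __name__ == "__main__":
        sample = {
            "Frequency of messages": "Satisfied",
            "Relevance of topics": "Strongly dissatisfied",
            "Clarity of writing": "Neutral",
        }
        for question in follow_up_questions(sample):
            print(question)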

From Rick Lesniak, IT Policy and Communications Officer, University at Buffalo (SUNY):

Alison's list is excellent! Looking over her tips prompted some further thoughts:

  1. Surveys are expensive. They take time to develop, administer, analyze, and report on. Additionally, each survey uses up a bit of "customer tolerance" that needs some time to recharge.

  2. With that said, are there any other ways to get at the data you seek? Consider log files from list servers, responses from other surveys, or existing measurements that could be applied deductively toward your objectives (a log-parsing sketch follows this list).

  3. Objectives. Alison starts with the assumption that you know your survey objectives, but in my experience that step is often overlooked and turns out to be crucial to obtaining actionable results. I think in terms of the "research question(s)". When I first read "assess engagement and satisfaction with those communication channels", my initial reaction was "I wonder why?", i.e., what is the research question? What problem or pain point is being encountered? Are we hearing anecdotes about a lack of communication? Difficulty in using the current tools? I'm still curious about this. Once the research question is articulated in writing, derive the informational objectives from it.

  4. Don't just ask yourself whether the question meets the objective. Ask a non-involved beta subject, ensuring this person is not already "thinking like you". Give them the question and ask them to respond to you verbally. You will learn an enormous amount about how seemingly minor differences in question wording produce tremendous differences in responses, and you will filter out IT jargon and reduce questions to simple, plain language.

  5. Lastly, be very careful with "satisfaction". You might want to focus on expectations, service levels, actual performance goals, etc. Satisfaction is very often misunderstood, oversimplified, and mostly meaningless. Remember that averaging "highly dissatisfied" and "highly satisfied" responses to "neutral" hides exactly the information you need! You really want more from your survey responses, and focusing on those statistical "tails" will make or break your service improvement plan (the short example after this list illustrates the point).
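
On point 2, here is a rough sketch of mining an existing data source instead of (or before) surveying: counting subscribes and unsubscribes per month from a list-server log as a crude engagement proxy. The log format here ("YYYY-MM-DD action address") is an assumption for illustration; adapt the parsing to whatever your list software actually writes.

    from collections import Counter

    def monthly_churn(log_path):
        """Return {'YYYY-MM': Counter({'subscribe': n, 'unsubscribe': m})}
        from a hypothetical 'YYYY-MM-DD action address' log file."""
        churn = {}
        with open(log_path) as log:
            for line in log:
                parts = line.split()
                if len(parts) < 2:
                    continue  # skip malformed lines
                date, action = parts[0], parts[1]
                month = date[:7]  # 'YYYY-MM'
                if action in ("subscribe", "unsubscribe"):
                    churn.setdefault(month, Counter())[action] += 1
        return churn

    if __name__ == "__main__":
        for month, counts in sorted(monthly_churn("list.log").items()):
            print(month, dict(counts))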
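
And on point 5, a short worked example of why an average satisfaction score can mislead. The two response sets below (coded 1 = strongly dissatisfied through 5 = strongly satisfied; the data is invented) have the same mean but very different tails, and they call for very different service improvement plans.

    from statistics import mean

    polarized = [1, 1, 1, 5, 5, 5]   # half love it, half hate it
    lukewarm = [3, 3, 3, 3, 3, 3]    # everyone is indifferent

    for name, scores in [("polarized", polarized), ("lukewarm", lukewarm)]:
        tails = sum(1 for s in scores if s in (1, 5))
        print(f"{name}: mean={mean(scores):.1f}, extreme responses={tails}")

    # Both sets print mean=3.0, so report the distribution (or at least
    # the tail counts), not just the average.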