
We have been conducting customer satisfaction surveys for several years for our central IT services and have seen the responses decline over the years. Last year we made some changes to try to bring the response rate back up. We were not successful. We are looking for ideas on how others conduct surveys, your response rates, tips on increasing responses (including incentives), and anything else you wish to share (please share any public websites we can view). This is one large survey sent to everyone (students, faculty, and staff) once per year. Below is what we did last year to try to change the trend:

·         We changed the tool to include skip logic, allowing respondents to answer only the questions that pertained to them and the services they use (a rough sketch of this kind of branching follows this list).

·         We led into the survey distribution with stories of what we did over the previous year that lined up with some of the feedback we received in the previous survey (not all actions were a direct result, but if work was already underway that addressed some of their concerns, it was included).

·         We moved the distribution date away from December and sent the survey out in March (looking for the biggest lull in campus activities).
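
To make the skip-logic idea concrete, here is a minimal sketch in plain Python; the questions and structure are hypothetical illustrations, not our actual survey or any particular tool's configuration.

# Minimal sketch of skip logic (hypothetical questions, not a real survey
# tool's configuration): a follow-up question is only asked when the
# respondent's gating answer makes it relevant.
QUESTIONS = {
    "uses_wifi": "Do you use campus Wi-Fi? (y/n)",
    "wifi_satisfaction": "How satisfied are you with campus Wi-Fi? (1-5)",
    "contacted_desk": "Did you contact the service desk this year? (y/n)",
    "desk_satisfaction": "How satisfied were you with the service desk? (1-5)",
}

# question -> (gating question, answer required for it to be shown)
SKIP_RULES = {
    "wifi_satisfaction": ("uses_wifi", "y"),
    "desk_satisfaction": ("contacted_desk", "y"),
}

def run_survey(ask):
    """ask(prompt) -> answer string; returns only the questions actually shown."""
    answers = {}
    for qid, prompt in QUESTIONS.items():
        gate = SKIP_RULES.get(qid)
        if gate and answers.get(gate[0]) != gate[1]:
            continue  # skip questions about services the respondent does not use
        answers[qid] = ask(prompt)
    return answers

# Example: run interactively at a console.
# answers = run_survey(input)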

 

We appreciate any advice or feedback you can provide on what has worked for you. Here is what we have kicked around so far:

·         Break the survey out by audience and target that way.

·         Break the survey out by service and target that way.

·         Look for incentives for taking the survey (open to any idea).

·         Target those individuals who respond to our service desk surveys for resolved tickets (we randomly survey staff as they receive support from our service desk and their tickets are resolved). We thought that maybe this group would be a source to reach out to, as they have been responsive to date.

·         Work with student groups to get their feedback on what might incent them to take the survey.

 

Thank you for any assistance you can provide.

 

 

Dwight Snethen

Director, Customer Service, IT Service Management and Quality Assurance

ITaP Customer Relations

Purdue University

Stewart Center: Room B14E

765-496-1035

 

Self Help Knowledgebase: http://www.purdue.edu/goldanswers

Assisted Technical Support: 765-494-4000

 


Comments

Comments/thoughts follow yours:

 

 

We have been conducting customer satisfaction surveys for several years for our central IT services and have seen the responses decline over the years. How are you using the results?  This will help determine the response rate you need…rather than lament a “lower” response rate.  You may not need as many as you’re getting. Last year we made some changes to try to bring the response rate back up. We were not successful. We are looking for ideas on how others conduct surveys, your response rates, tips on increasing responses (including incentives), and anything else you wish to share (please share any public websites we can view). This is one large survey sent to everyone (students, faculty, and staff) once per year. Below is what we did last year to try to change the trend:

·         We changed the tool to include skip logic, allowing respondents to answer only the questions that pertained to them and the services they use.  Nice improvement.  But, if you have a low response rate, those who are not responding will not know that you changed the logic.

·         We led into the survey distribution with stories of what we did over the previous year that lined up with some of the feedback we received in the previous survey (not all actions were a direct result, but if work was already underway that addressed some of their concerns, it was included).  Excellent.  So you’re telling them what you did with the information they gave you.  Do you also make the survey results available?  Charts/graphs?

·         We moved the distribution date away from December and sent the survey out in March (looking for the biggest lull in campus activities).  Sounds like a good idea…depends on your campus of course.  If you do it during a student break, you may not have as many around?

 

I would suggest that your focus should be on the email inviting them to take the survey.  For example, you could tell them how long it will take them to complete the survey (with the skip logic, it should go faster).  The email is where you need to “hook” them to do the survey.  The email is your SALES pitch and should grab their attention. 

 

We appreciate any advice or feedback you can provide on what has worked for you. Here is what we have kicked around so far:

·         Break the survey out by audience and target that way.  If it will make it shorter to fill out, this is feasible.  We did our annual surveys separately by target audience so we didn’t have to ask them for that demographic.

·         Break the survey out by service and target that way.  It would help with the demographic questions.

·         Look for incentives for taking the survey (open to any idea).  Our student responses were excellent – we gave a pizza party to the dorm with the most responses.  We also put names in a drawing for a digital camera (we did both in conjunction), and we got great levels of response.

·         Target those individuals who respond to our service desk surveys for resolved tickets (we randomly survey staff as they receive support from our service desk and their tickets are resolved). We thought that maybe this group would be a source to reach out to, as they have been responsive to date.  Disagree – you already get their feedback (and they are the ones who will likely respond anyway).  You need to hear from your customers who are not being heard.

·         Work with student groups to get their feedback on what might incent them to take the survey.  Food works here.  See above.

 

Some general suggestions:

1.       Make it a marketing campaign – in other words, focus on the email requesting that they fill out the survey.  Sell it.  Tell them how quickly it can be done.  Tell them how you’ll use the results.  Tell them where they can go to see the results (people like to see the results).

2.       Focus on making the survey SHORT and to the point.   This requires that you fully identify how you will use the results. 

1.       I’ve found that there are usually many questions that you don’t need to ask – but someone “wants” the answers even though they won’t USE them.  Eliminate these. 

2.       Get rid of as many demographic questions as you can (staff, fac, stud/ service you used/ etc.).  You can do this sometimes by using different surveys for each demographic.  Sometimes you don’t need the demographics.  The goal is to make the survey easy and quick to fill out.

3.       Check your questions on the survey.  Are they long?  Are they hard to decipher?  They should be as simple, clean, and to the point as possible.

4.       Ensure every question will provide a usable answer (check your wording) AND identify how each will be used.  You don’t have to share this with the customers, but it will ensure you don’t have to ask more than one question to get to your answer AND allow you to delete ones that are not really needed.

3.       We made our survey a certain length (10 questions) so that we could tell others who asked to be included (because it would be “interesting to know” something) that we couldn’t add their questions because we already had the 10 most critical questions.

4.       Find incentives that work for each audience.  Food works for students, but not for faculty. 

5.       Build a simple webpage (website) where you have the stories of what you did with the information, charts and graphs showing the results, and PICTURES.  When they complete the survey, have them automatically sent to the webpage.

6.       Check on how many responses you need.  It’s always nice to have high response rates…but what is yours?  Is it high enough?  You can do statistical analysis on the number of responses, and that will tell you whether your number of responses is good enough.  You want 95% confidence that the results are representative of your population.
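
To put that last point in concrete terms, here is a minimal sketch of the standard sample-size calculation for a proportion at 95% confidence, with a finite-population correction (plain Python; the population figure is a placeholder, not anyone’s actual headcount).

import math

def required_responses(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Responses needed for +/- margin_of_error at ~95% confidence (z = 1.96).

    Standard formula n0 = z^2 * p * (1 - p) / e^2, followed by a
    finite-population correction n = n0 / (1 + (n0 - 1) / N).
    p = 0.5 is the most conservative assumption about the answer split.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Placeholder population of 50,000 (not an actual figure): roughly 382
# completed responses give +/-5% at 95% confidence.
print(required_responses(50000))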

 

Marty

 

 


Would you all be willing to share the survey questions?

 

 

Dwight Snethen

x61035

 

Hi Dwight,

 

I read this a while back, but I’m just getting to a response now.  I hope it’s still helpful. :)

 

Here are my thoughts on the following topics -

 

Incentives – The literature is mixed on whether incentives actually work.  For most paper surveys, pre-paid incentives work better than post-paid incentives.  I’m not sure whether this is different for web surveys, however.  At Carnegie Mellon, incentives only marginally increased response rates.  If you’re going to do it, I’d suggest fewer, higher value prizes over a drawing for several smaller prizes.  And Marty’s right.  Students do love food.

 

Length of survey – Keep it short.  Surveying is easy these days and people are over-surveyed; respondent fatigue is high.  In fact, your response rate may have dropped over the years due to this alone.

 

Time of year – You mentioned that you moved the survey to the spring.  Does the launch period now overlap with another major survey?

 

Day of week for launch – Take a look at response logs for past surveys.  On which day of the week are people most likely to respond?  This might be a good launch day.  At CMU, we tried to send faculty surveys out on Friday afternoons and student surveys out on Sunday evenings.  The idea here was that faculty are wrapping up their week on Friday and might like an easy item to cross off the to-do list.  On Sundays, most students are preparing for the coming week and might enjoy an easy exercise to get their week started.
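
If you have the raw timestamps, that check is quick to do.  A rough sketch follows, assuming a CSV export with one row per response; the file name and submitted_at column are hypothetical placeholders for your own log export (pandas, illustrative only).

import pandas as pd

# Count past responses by weekday to help pick a launch day.
log = pd.read_csv("survey_responses_2013.csv", parse_dates=["submitted_at"])
weekdays = ["Monday", "Tuesday", "Wednesday", "Thursday",
            "Friday", "Saturday", "Sunday"]
by_day = (log["submitted_at"].dt.day_name()
          .value_counts()
          .reindex(weekdays, fill_value=0))
print(by_day)  # the peak day is a reasonable candidate for the next launch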

 

Length of launch – This may sound counter-intuitive, but you might try making the survey window shorter.  Folks will feel more urgency around completing the survey if they know it will close sooner rather than later.  They might even fill it out right when they get the invitation instead of letting it fall to the bottom of their inbox.  We saw only marginal benefit from keeping a survey open longer than one week.

 

Survey reminders – Set a firm reminder schedule for each survey to remind your non-responders that they haven’t yet replied.  I suggest sending a reminder in the middle of the launch period and again on the last day.  These email reminders are great for getting little bumps in response. 

 

That pretty much does it for my advice on additional things you might consider.  I’m really glad you’re thinking about customer satisfaction.  I’ll be conducting an ECAR study on customer satisfaction this spring.  We’d like to try to collect as much information as possible about the topic and then construct a common CS survey that any school can use for their services.  Then, eventually, the aggregate responses could be contributed to Core Data so that schools can benchmark on these trends.  This is a few years off though.  Please let me know if you’d like to discuss our survey further.  We’re really interested in everyone’s challenges and priorities for this topic.

 

Thanks!

Leah

 

 

Leah Lang

Senior IT Metrics and Benchmarking Analyst

EDUCAUSE

llang@educause.edu

 

 

 

How do you all handle survey results for your team when the ticket was handled by another team? Do you view the satisfaction surveys as being for all of IT or for the group handling the case? If for the group handling the case, and you get a low score but the comments are clearly directed at another support unit, do you treat that differently: delete the score, reassign it, or leave it in the mix?

 

Dwight Snethen

Director, Customer Service, IT Service Management and Quality Assurance

ITaP Customer Relations

Purdue University

Stewart Center: Room B14E

765-496-1035

 

Self Help Knowledgebase: http://www.purdue.edu/goldanswers

Assisted Technical Support: 765-494-4000

 


Hi Dwight,

 

This is a common dilemma for 1st level support teams, as the customer doesn’t usually know (or even care) which team provided the service. When they get the survey, they are usually responding to the end-to-end service received, depending on how the questions are framed. Below are a few ideas for how I’ve handled this in the past:

 

·         To narrow the focus and only get feedback on the Service Desk: use specific questions, such as: 

o   “Please evaluate the analyst who initially responded to your request for service,” or

o   have sections like First Contact and Last Contact – those cues will provide a frame of reference for the survey participant, focusing them on the particular aspect of the service you’re interested in evaluating.

 

·         Another variation I’ve tried – we believed our customers didn’t distinguish between the Service Desk and other teams. To focus strictly on responses related specifically to the Service Desk, I asked “Did the Service Desk resolve your question?” and then compared this response to the actual resolving group (a rough sketch of this check follows this list).

o   If these responses were in agreement (“yes, the Service Desk resolved my issue” and “Service Desk” was recorded as the resolving group), then the survey results were accepted and tabulated as reflecting on Service Desk performance.

o   If, however, they answered “yes, the Service Desk resolved my issue” and another group was recorded as resolving, then I marked all responses as invalid for Service Desk evaluation. It was definitely an eye-opening exercise – we were surprised to learn how many people perceived the Service Desk as being responsible for, or actually resolving, all issues reported to IT.

·         Another option is to use surveys immediately after the call ends, utilizing ACD/IVR technology.

·         Finally, you can take the position that if the Service Desk truly embraces ITIL standards and maintains complete ownership of all issues reported to the Service Desk, then all responses, good and bad, should be used to evaluate performance.
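
For what it’s worth, here is a rough sketch of that validity check, assuming the survey responses have already been joined to ticket data; the file name and column names are hypothetical placeholders, not any particular ticketing system’s export.

import pandas as pd

# Hypothetical columns: "said_desk_resolved" is the respondent's yes/no answer
# to "Did the Service Desk resolve your question?", and "resolving_group" is
# what the ticket system recorded.
responses = pd.read_csv("ticket_survey_responses.csv")
said_yes = responses["said_desk_resolved"].str.strip().str.lower().eq("yes")
desk_resolved = responses["resolving_group"].eq("Service Desk")

# Accept a response for Service Desk evaluation only when the two sources agree.
responses["valid_for_desk"] = said_yes & desk_resolved

# The disagreements are informative in their own right: how often customers
# credit the Service Desk for tickets another group actually resolved.
misattributed = said_yes & ~desk_resolved
print(f"{misattributed.sum()} of {len(responses)} responses credit the Service Desk "
      "for work resolved by another group")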

 

I think there is value in keeping the results and comparing – it may be interesting to look at the trends.

 

If you want to discuss further, I’d be happy to help.

 

Anita Nichols

UC Davis

530.752.4386

Thanks.

Dwight Snethen
Director, Customer Service, IT Service Management and Quality Assurance
ITaP Customer Relations
Purdue University
Stewart Center: Room B14E
765-496-1035

Self Help Knowledgebase: http://www.purdue.edu/goldanswers
Assisted Technical Support: 765-494-4000
